An Easy Way to Install wget in the Mac OS Terminal
Mukharom.com – Wget is a text-based (CLI) program that is commonly used to download files over various protocols; it can fetch a file via FTP, HTTP, and HTTPS. I am one of those people who like using wget to download files from the terminal, especially on Linux. Downloading with wget feels stable and flexible, and I never have to worry about starting over from scratch when the connection drops, because the transfer can simply be resumed.
Having not used Linux as the daily driver on my notebook for a long time, because I had grown comfortable with Mac OS, I was a bit lost when I wanted to download something from the terminal. The file I was about to download was fairly large, and I did not want to grab it with a browser because that would eat up quite a lot of memory. The same goes for a download manager: I am too lazy to install applications I rarely use. In the end I thought of using wget in the terminal, but annoyingly, when I typed it in, it turned out wget is not installed by default. Sigh.
So I searched the internet, but the installation guides I found all required Homebrew, which I had not installed either. Sigh(1). Luckily, one of my coworkers, Om Ahmad, is a Linux guru and master who once won the national-level LKS competition, and he helped me install it. So how exactly do you install wget for downloading in the Terminal on Mac OS or a MacBook? It is very easy; let's walk through this simple tutorial on installing wget in the Macintosh Terminal.
Tutorial: Installing wget in the Mac OS Terminal
The first thing you need to do is open the terminal on the Mac OS machine you are using. Once it is open, log in as the root user and change your working directory to /usr/local/bin/ with the cd command. If you are not sure how, just follow these steps:
Fajars-MacBook-Air:~ fajarmukharom$ sudo su
Password:
sh-3.2# cd /usr/local/bin/
Once you are inside the /usr/local/bin/ directory, the next step is to download wget through the terminal. Wait, download it via the terminal? Isn't wget itself the tool for downloading?
To download wget itself I use curl; I just happen to prefer wget for everyday use. Forgive me, curl 🙁. To download wget, simply run the command curl -O https://mukharom.com/wget and wget will be downloaded into that directory.
sh-3.2# curl -O https://mukharom.com/wget
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  701k  100  701k    0     0   889k      0 --:--:-- --:--:-- --:--:--  888k
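(An optional check, not part of the original steps:) if you want to confirm the file really landed in that directory, a quick ls will list the freshly downloaded binary along with its size:

sh-3.2# ls -lh /usr/local/bin/wget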
The download finishes quickly because the file is tiny. The next step is to make the file executable: just type chmod +x wget and press Enter.
sh-3.2# chmod +x wget
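To double-check that the binary is now executable and reachable from your PATH, you can optionally run these two commands (not part of the original walkthrough; the first should print /usr/local/bin/wget and the second should print the GNU Wget version banner):

sh-3.2# which wget
sh-3.2# wget --version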
To download a file with wget in the terminal, simply type wget http://url and the file will be downloaded. If you get stuck or want to know what options wget provides, you can use its built-in help with the command wget --help
sh-3.2# wget --help
GNU Wget 1.16.3, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version                   display the version of Wget and exit
  -h,  --help                      print this help
  -b,  --background                go to background after startup
  -e,  --execute=COMMAND           execute a `.wgetrc'-style command

Logging and input file:
  -o,  --output-file=FILE          log messages to FILE
  -a,  --append-output=FILE        append messages to FILE
  -d,  --debug                     print lots of debugging information
  -q,  --quiet                     quiet (no output)
  -v,  --verbose                   be verbose (this is the default)
  -nv, --no-verbose                turn off verboseness, without being quiet
       --report-speed=TYPE         output bandwidth as TYPE. TYPE can be bits
  -i,  --input-file=FILE           download URLs found in local or external FILE
  -F,  --force-html                treat input file as HTML
  -B,  --base=URL                  resolves HTML input-file links (-i -F) relative to URL
       --config=FILE               specify config file to use
       --no-config                 do not read any config file

Download:
  -t,  --tries=NUMBER              set number of retries to NUMBER (0 unlimits)
       --retry-connrefused         retry even if connection is refused
  -O,  --output-document=FILE      write documents to FILE
  -nc, --no-clobber                skip downloads that would download to existing files (overwriting them)
  -c,  --continue                  resume getting a partially-downloaded file
       --start-pos=OFFSET          start downloading from zero-based position OFFSET
       --progress=TYPE             select progress gauge type
       --show-progress             display the progress bar in any verbosity mode
  -N,  --timestamping              don't re-retrieve files unless newer than local
       --no-use-server-timestamps  don't set the local file's timestamp by the one on the server
  -S,  --server-response           print server response
       --spider                    don't download anything
  -T,  --timeout=SECONDS           set all timeout values to SECONDS
       --dns-timeout=SECS          set the DNS lookup timeout to SECS
       --connect-timeout=SECS      set the connect timeout to SECS
       --read-timeout=SECS         set the read timeout to SECS
  -w,  --wait=SECONDS              wait SECONDS between retrievals
       --waitretry=SECONDS         wait 1..SECONDS between retries of a retrieval
       --random-wait               wait from 0.5*WAIT...1.5*WAIT secs between retrievals
       --no-proxy                  explicitly turn off proxy
  -Q,  --quota=NUMBER              set retrieval quota to NUMBER
       --bind-address=ADDRESS      bind to ADDRESS (hostname or IP) on local host
       --limit-rate=RATE           limit download rate to RATE
       --no-dns-cache              disable caching DNS lookups
       --restrict-file-names=OS    restrict chars in file names to ones OS allows
       --ignore-case               ignore case when matching files/directories
  -4,  --inet4-only                connect only to IPv4 addresses
  -6,  --inet6-only                connect only to IPv6 addresses
       --prefer-family=FAMILY      connect first to addresses of specified family, one of IPv6, IPv4, or none
       --user=USER                 set both ftp and http user to USER
       --password=PASS             set both ftp and http password to PASS
       --ask-password              prompt for passwords
       --no-iri                    turn off IRI support
       --local-encoding=ENC        use ENC as the local encoding for IRIs
       --remote-encoding=ENC       use ENC as the default remote encoding
       --unlink                    remove file before clobber

Directories:
  -nd, --no-directories            don't create directories
  -x,  --force-directories         force creation of directories
  -nH, --no-host-directories       don't create host directories
       --protocol-directories      use protocol name in directories
  -P,  --directory-prefix=PREFIX   save files to PREFIX/..
       --cut-dirs=NUMBER           ignore NUMBER remote directory components

HTTP options:
       --http-user=USER            set http user to USER
       --http-password=PASS        set http password to PASS
       --no-cache                  disallow server-cached data
       --default-page=NAME         change the default page name (normally this is 'index.html'.)
  -E,  --adjust-extension          save HTML/CSS documents with proper extensions
       --ignore-length             ignore 'Content-Length' header field
       --header=STRING             insert STRING among the headers
       --max-redirect              maximum redirections allowed per page
       --proxy-user=USER           set USER as proxy username
       --proxy-password=PASS       set PASS as proxy password
       --referer=URL               include 'Referer: URL' header in HTTP request
       --save-headers              save the HTTP headers to file
  -U,  --user-agent=AGENT          identify as AGENT instead of Wget/VERSION
       --no-http-keep-alive        disable HTTP keep-alive (persistent connections)
       --no-cookies                don't use cookies
       --load-cookies=FILE         load cookies from FILE before session
       --save-cookies=FILE         save cookies to FILE after session
       --keep-session-cookies      load and save session (non-permanent) cookies
       --post-data=STRING          use the POST method; send STRING as the data
       --post-file=FILE            use the POST method; send contents of FILE
       --method=HTTPMethod         use method "HTTPMethod" in the request
       --body-data=STRING          send STRING as data. --method MUST be set
       --body-file=FILE            send contents of FILE. --method MUST be set
       --content-disposition       honor the Content-Disposition header when choosing local file names (EXPERIMENTAL)
       --content-on-error          output the received content on server errors
       --auth-no-challenge         send Basic HTTP authentication information without first waiting for the server's challenge

HTTPS (SSL/TLS) options:
       --secure-protocol=PR        choose secure protocol, one of auto, SSLv2, SSLv3, TLSv1 and PFS
       --https-only                only follow secure HTTPS links
       --no-check-certificate      don't validate the server's certificate
       --certificate=FILE          client certificate file
       --certificate-type=TYPE     client certificate type, PEM or DER
       --private-key=FILE          private key file
       --private-key-type=TYPE     private key type, PEM or DER
       --ca-certificate=FILE       file with the bundle of CAs
       --ca-directory=DIR          directory where hash list of CAs is stored
       --crl-file=FILE             file with bundle of CRLs
       --random-file=FILE          file with random data for seeding the SSL PRNG
       --egd-file=FILE             file naming the EGD socket with random data

FTP options:
       --ftp-user=USER             set ftp user to USER
       --ftp-password=PASS         set ftp password to PASS
       --no-remove-listing         don't remove '.listing' files
       --no-glob                   turn off FTP file name globbing
       --no-passive-ftp            disable the "passive" transfer mode
       --preserve-permissions      preserve remote file permissions
       --retr-symlinks             when recursing, get linked-to files (not dir)

WARC options:
       --warc-file=FILENAME        save request/response data to a .warc.gz file
       --warc-header=STRING        insert STRING into the warcinfo record
       --warc-max-size=NUMBER      set maximum size of WARC files to NUMBER
       --warc-cdx                  write CDX index files
       --warc-dedup=FILENAME       do not store records listed in this CDX file
       --no-warc-compression       do not compress WARC files with GZIP
       --no-warc-digests           do not calculate SHA1 digests
       --no-warc-keep-log          do not store the log file in a WARC record
       --warc-tempdir=DIRECTORY    location for temporary files created by the WARC writer

Recursive download:
  -r,  --recursive                 specify recursive download
  -l,  --level=NUMBER              maximum recursion depth (inf or 0 for infinite)
       --delete-after              delete files locally after downloading them
  -k,  --convert-links             make links in downloaded HTML or CSS point to local files
       --backups=N                 before writing file X, rotate up to N backup files
  -K,  --backup-converted          before converting file X, back up as X.orig
  -m,  --mirror                    shortcut for -N -r -l inf --no-remove-listing
  -p,  --page-requisites           get all images, etc. needed to display HTML page
       --strict-comments           turn on strict (SGML) handling of HTML comments

Recursive accept/reject:
  -A,  --accept=LIST               comma-separated list of accepted extensions
  -R,  --reject=LIST               comma-separated list of rejected extensions
       --accept-regex=REGEX        regex matching accepted URLs
       --reject-regex=REGEX        regex matching rejected URLs
       --regex-type=TYPE           regex type (posix)
  -D,  --domains=LIST              comma-separated list of accepted domains
       --exclude-domains=LIST      comma-separated list of rejected domains
       --follow-ftp                follow FTP links from HTML documents
       --follow-tags=LIST          comma-separated list of followed HTML tags
       --ignore-tags=LIST          comma-separated list of ignored HTML tags
  -H,  --span-hosts                go to foreign hosts when recursive
  -L,  --relative                  follow relative links only
  -I,  --include-directories=LIST  list of allowed directories
       --trust-server-names        use the name specified by the redirection URL's last component
  -X,  --exclude-directories=LIST  list of excluded directories
  -np, --no-parent                 don't ascend to the parent directory
Once that's done, try using wget to download a file from the internet. I usually run wget -c http://url so the download can be resumed if the connection drops.
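As a quick illustration (the URL below is only a placeholder), if a download is interrupted partway through, for example with Ctrl+C, running the exact same command again with -c continues from the bytes already saved instead of starting from zero:

sh-3.2# wget -c https://example.com/large-file.iso
^C
sh-3.2# wget -c https://example.com/large-file.iso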
See, isn't installing wget in the Mac OS terminal easy? I can finally download from the terminal with wget again 😀. That way I can pull large files at the office without drawing attention, since nothing shows up in the browser, and keeping a terminal open makes it look like I'm still working hahaha 😀
Have a good day!
Hello, I want to install bebasid on my MacBook but I don't really get it, because the command is sudo wget bla bla bla. I googled it and found this blog. I tried installing wget, and at the first step this is what I got:
sh-3.2# cd /usr/local/bin/
sh: cd: /usr/local/bin/: No such file or directory
sh-3.2#
What does that mean, and what should it be instead? Thanks in advance.
Hi, did you type the command “sudo su” beforehand? And which version of MacOS is your MacBook currently running?
The “cd /usr/local/bin/” command moves you into a system directory, specifically /usr/local/bin/, so that the downloaded package can be executed and called directly as a command in the terminal.
Thank you for the response, kak Fajar. I did type the “sudo su” command and the result is as I described. macOS Catalina 10.15.4.
So does that mean there is something I haven't downloaded/installed? How can I install wget if that's the situation?
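One possible cause, judging only from the “No such file or directory” message above, is that /usr/local/bin/ simply does not exist yet on that installation. A minimal workaround, assuming that is indeed the case, is to create the directory as root first and then repeat the steps from the tutorial:

sh-3.2# mkdir -p /usr/local/bin
sh-3.2# cd /usr/local/bin/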