wget???
So, after setting up a PC with SuSE Linux 8.2 (minimal installation, i.e. non-graphical), I can no longer use the wget command.
--------------------
linux:~ # wget http://www.google.de
-bash: wget: command not found
--------------------
How can I reinstall it, or is there another way to download files?
:?:
sgg
-
- Posts: 31
- Joined: 2003-10-29 22:37
- Location: Monheim
Re: wget???
Well, then just install wget...
-
- Project Manager
- Posts: 11139
- Joined: 2003-02-27 01:00
- Location: Hamburg
Re: wget???
Code:
su - root
# mount the SuSE installation CD (adjust /dev/hdb if your drive differs)
mount /dev/hdb /media/cdrom
# install the wget package from the CD
rpm -ivh /media/cdrom/suse/i586/wget-*.i586.rpm
umount /dev/hdb
SuSEconfig --verbose
exit
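Afterwards a quick check should show that wget is back (just a sanity check, the URL is only an example):
Code:
which wget
wget http://www.google.de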
PayPal.Me/JoeUser ● FreeBSD Remote Installation
Wings for Life ● Wings for Life World Run
„If there’s more than one possible outcome of a job or task, and one
of those outcomes will result in disaster or an undesirable consequence,
then somebody will do it that way.“ -- Edward Aloysius Murphy Jr.
-
- Posts: 2223
- Joined: 2002-07-28 13:02
- Location: Berlin
Re: wget???
sgg wrote: ... or is there another way to download files?
There is also curl, and lynx as well; both are worth having on a server.
Regards,
flo.
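For example, both could be used roughly like this (just a sketch, the URL is arbitrary):
Code:
# fetch a page with curl and keep the remote file name
curl -O http://www.google.de/index.html
# fetch the same page with lynx by dumping the raw source into a file
lynx -source http://www.google.de/index.html > index.html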
-
- Project Manager
- Posts: 11139
- Joined: 2003-02-27 01:00
- Location: Hamburg
Re: wget???
Or you do it with bash ;)
PayPal.Me/JoeUser ● FreeBSD Remote Installation
Wings for Life ● Wings for Life World Run
„If there’s more than one possible outcome of a job or task, and one
of those outcomes will result in disaster or an undesirable consequence,
then somebody will do it that way.“ -- Edward Aloysius Murphy Jr.
-
- Posts: 2223
- Joined: 2002-07-28 13:02
- Location: Berlin
Re: wget???
Joe User wrote: Or you do it with bash ;)
?????
Yes, fine, bash is okay, better than tcsh, but what exactly do you mean?
flo.
-
- Project Manager
- Posts: 11139
- Joined: 2003-02-27 01:00
- Location: Hamburg
Re: wget???
I mean HTTP via bash :roll:
PayPal.Me/JoeUser ● FreeBSD Remote Installation
Wings for Life ● Wings for Life World Run
„If there’s more than one possible outcome of a job or task, and one
of those outcomes will result in disaster or an undesirable consequence,
then somebody will do it that way.“ -- Edward Aloysius Murphy Jr.
-
- Posts: 2223
- Joined: 2002-07-28 13:02
- Location: Berlin
Re: wget???
Don't take me for an idiot, but please, what do you mean by HTTP via bash?
To me that means calling an external program ...
wget and curl for downloading
lynx and w3m for browsing
But I can't make any sense of "HTTP via bash"?
Please enlighten me ...
Regards,
flo.
-
- Posts: 553
- Joined: 2002-10-05 16:39
- Location: D'dorf
Re: wget???
The lowest level for me would be telnet/netcat. Does bash itself have any networking capability?
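For comparison, a raw HTTP request over netcat could look like this (a sketch; host and path are only examples, and the response still contains the HTTP headers):
Code:
# send a minimal HTTP/1.0 request and write the raw response to a file
printf 'GET /index.html HTTP/1.0\r\nHost: www.google.de\r\n\r\n' \
    | nc www.google.de 80 > response.txt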
-
- Project Manager
- Posts: 11139
- Joined: 2003-02-27 01:00
- Location: Hamburg
Re: wget???
Source: http-get from the LFS utils (download)
man bash
Code:
#!/bin/bash
# This script is designed to be a primitive web/ftp client, to pull a file
# from somewhere on the Internet when you have nothing but the base
# LFS. Generally, it's recommended that you use it one time -- to pull
# a Web or FTP client to get the rest of the stuff you need.
# If somebody doesn't know how to call the script, tell them.
if [ "x$1" = "x" ]
then echo "USAGE: $0 <URL>"
     exit 1
fi
# Pick apart the URL, and break it into the pieces we need.
URL=${1##http://} &&
SERVER=${URL%%/*} &&
FULLFILENAME=${URL#$SERVER} &&
FILENAME=${FULLFILENAME##*/}
echo -n "Fetching $SERVER$FULLFILENAME... "
#echo URL=$URL
#echo SERVER=$SERVER
#echo FULLNAME=$FULLFILENAME
#echo FILENAME=$FILENAME
# There are two important (and somewhat unusual) pieces to this
# next command (or series of commands -- it gets tricky trying to
# say which it is).
# The first is the "3<>" part. This says to open file descriptor
# 3 for BOTH input and output.
# The second important piece is "/dev/tcp". There is no actual entry in /dev
# for this -- it's a construct of bash itself (the redirection code
# handles it). It allows you to do TCP interactions
# with any machine. The format is "/dev/tcp/<address>/<port>". <address> can
# be either an IP address or a hostname. A hostname gets resolved just like
# it does with any other TCP/IP application. Since we are using HTTP
# in this case, the port will be 80.
# First, we send the HTTP command to file descriptor 3. Note the "&" on
# the end. Without that, we'll never get out of the command so that we
# can execute the next one.
(echo -e "GET $FULLFILENAME HTTP/0.9\r\n\r\n" 1>&3 &
# Just like we rerouted stdout (file descriptor 1) to FD3 in the last
# command, we reroute stdin (FD0) from FD3 in this one.
# We also tell bash that we want to open FD3 for both input and output
# at the same time.
# Since our earlier command was written to it, what happens once FD3
# is built and associated with the remote server, is that that server
# gets sent "GET <path/filename> HTTP/0.9". The server very obediently
# sends back that file.
cat 0<&3) 3<> /dev/tcp/$SERVER/80 |
(
# The "(" and ")" around all of these statements form a subshell.
# One reason it is done is that it only requires one output
# statement to handle all the commands in the subshell. In this
# case, it has a more important purpose: it allows us to feed info
# to all the commands with one INPUT. (The pipe, which we can only
# have one of.)
# Without the subshell, the pipe would read into the first "read i",
# and that's all that would ever happen.
# Prime the variable "i" so the while will work properly.
read i
# As long as there's HTTP text coming back from the GET request, keep
# reading.
while [ x"$(echo $i | tr -d '\r')" != "x" ]
do
read i
done
# Copy whatever's coming in (from fd3) to stdout (which is about to
# be redirected to $FILENAME).
cat
) >$FILENAME
echo "done."
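To use it, you would save the script (here hypothetically as http-get), make it executable and pass a complete URL including the path:
Code:
chmod +x http-get
./http-get http://www.google.de/index.html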
PayPal.Me/JoeUser ● FreeBSD Remote Installation
Wings for Life ● Wings for Life World Run
„If there’s more than one possible outcome of a job or task, and one
of those outcomes will result in disaster or an undesirable consequence,
then somebody will do it that way.“ -- Edward Aloysius Murphy Jr.
-
- Posts: 2223
- Joined: 2002-07-28 13:02
- Location: Berlin
Re: wget???
Hats off - I do understand it ...
... but you have to think of it in the first place ...
Thanks and regards,
flo.
-
- Posts: 553
- Joined: 2002-10-05 16:39
- Location: D'dorf
Re: wget???
Interesting.. thanks from me too.. you never stop learning ;)