wget???

Articles, guides and discussions worth reading
Anonymous

wget???

Post by Anonymous » 2003-12-10 19:50

So, after setting up a PC with SuSE Linux 8.2 (minimal installation, i.e. non-graphical), I can no longer use the wget command.

--------------------

linux:~ # wget http://www.google.de
-bash: wget: command not found

--------------------
How can I reinstall it, or is there another way to download files?

:?:

sgg

mcdave2k1
Posts: 31
Joined: 2003-10-29 22:37
Location: Monheim

Re: wget???

Post by mcdave2k1 » 2003-12-10 20:51

Then just go ahead and install wget...

Joe User
Project Manager
Posts: 11583
Joined: 2003-02-27 01:00
Location: Hamburg

Re: wget???

Post by Joe User » 2003-12-10 20:59

Code:

su - root
# mount the SuSE installation CD (the CD drive here is /dev/hdb)
mount /dev/hdb /media/cdrom
# install the wget package directly from the CD
rpm -ivh /media/cdrom/suse/i586/wget-*.i586.rpm
umount /dev/hdb
# let SuSEconfig update the system configuration
SuSEconfig --verbose
exit
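
After that, wget should be available again; a quick test with the URL from the first post:

Code:

wget http://www.google.de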

Anonymous

Thanks

Post by Anonymous » 2003-12-10 21:53

Thanks a lot! :-D

flo
RSAC
Posts: 2297
Joined: 2002-07-28 13:02
Location: Berlin

Re: wget???

Post by flo » 2003-12-11 01:05

sgg wrote: ... or is there another way to download files?
There is also curl, and lynx as well; both are worth having on a server.
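
For instance (just a sketch; the google.de URL from the first post stands in for whatever you actually want to fetch):

Code:

# download a page with curl and save it under a chosen name
curl -o google.html http://www.google.de
# dump a page as plain text with lynx
lynx -dump http://www.google.de > google.txt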

Regards,

flo.

Joe User
Project Manager
Posts: 11583
Joined: 2003-02-27 01:00
Location: Hamburg

Re: wget???

Post by Joe User » 2003-12-11 10:05

Or you can just do it via bash ;)

flo
RSAC
Posts: 2297
Joined: 2002-07-28 13:02
Location: Berlin

Re: wget???

Post by flo » 2003-12-11 11:36

Joe User wrote: Or you can just do it via bash ;)
?????

Yes, fine, bash is OK, better than tcsh, but what exactly do you mean?

flo.

Joe User
Project Manager
Posts: 11583
Joined: 2003-02-27 01:00
Location: Hamburg

Re: wget???

Post by Joe User » 2003-12-11 15:57

I mean HTTP via bash :roll:

flo
RSAC
Posts: 2297
Joined: 2002-07-28 13:02
Location: Berlin

Re: wget???

Post by flo » 2003-12-11 17:45

Don't take me for an idiot now, but please, what do you mean by HTTP via bash?

For me, that means calling an external program ...

wget and curl for downloading
lynx and w3m for browsing

But I can't make anything of "HTTP via bash".

Please enlighten me ... ;-)

Regards,

flo.

darkspirit
RSAC
Posts: 568
Joined: 2002-10-05 16:39
Location: D'dorf

Re: wget???

Post by darkspirit » 2003-12-11 17:51

The lowest level for me would be telnet/netcat. Does bash itself have networking capabilities?
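
With netcat, the raw version would look roughly like this (just a sketch; note that the saved response still contains the HTTP headers, and some netcat variants need -q 1 so they wait for the reply after stdin closes):

Code:

# send a raw HTTP request with netcat and save the reply
printf 'GET /index.html HTTP/1.0\r\nHost: www.google.de\r\n\r\n' | nc www.google.de 80 > response.txt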

Joe User
Project Manager
Posts: 11583
Joined: 2003-02-27 01:00
Location: Hamburg

Re: wget???

Post by Joe User » 2003-12-11 18:28

Source: http-get from the LFS-Utils (download)

Code:

#!/bin/bash

# This script is designed to be a primitive web/ftp client, to pull a file
# from somewhere on the Internet when you have nothing but the base
# LFS. Generally, it's recommended that you use it one time -- to pull
# a Web or FTP client to get the rest of the stuff you need.

# If somebody doesn't know how to call the script, tell them.
if [ "x$1" = "x" ]
  then echo "USAGE: $0 <URL>"
       exit 1
fi

# Pick apart the URL, and break it into the pieces we need.
URL=${1##http://} &&
SERVER=${URL%%/*} &&
FULLFILENAME=${URL#$SERVER} &&
FILENAME=${FULLFILENAME##*/}

echo -n "Fetching $SERVER$FULLFILENAME... "

#echo URL=$URL
#echo SERVER=$SERVER
#echo FULLNAME=$FULLFILENAME
#echo FILENAME=$FILENAME

# There are two important (and somewhat unusual) pieces to this
# next command (or series of commands -- it gets tricky trying to
# say which it is).

# The first is the "3<>" part. This says to open file descriptor
# 3 for BOTH input and output.
# The second important piece is "/dev/tcp". There is no actual entry in /dev
# for this -- it's a construct of bash (actually, more properly
# it's a construct of readline). It allows you to do TCP interactions
# with any machine. The format is "/dev/tcp/<address>/<port>". <address> can
# be either an IP address or a hostname. A hostname gets resolved just like
# it does with any other TCP/IP application. Since we are using HTTP
# in this case, the port will be 80.

# First, we send the HTTP command to file descriptor 3. Note the "&" on
# the end. Without that, we'll never get out of the command so that we
# can execute the next one.
(echo -e "GET $FULLFILENAME HTTP/0.9\r\n\r\n" 1>&3 &

# Just like we rerouted stdout (file descriptor 1) to FD3 in the last
# command, we reroute stdin (FD0) from FD3 in this one.
# We also tell bash that we want to open FD3 for both input and output
# at the same time.
# Since our earlier command was written to it, what happens once FD3
# is built and associated with the remote server, is that that server
# gets sent "GET <path/filename> HTTP/0.9". The server very obediently
# sends back that file.
cat 0<&3) 3<> /dev/tcp/$SERVER/80 |
 (
  # The "(" and ")" around all of these statements form a subshell.
  # One reason it is done is that it only requires one output
  # statement to handle all the commands in the subshell. In this
  # case, it has a more important purpose: it allows us to feed info
  # to all the commands with one INPUT. (The pipe, which we can only
  # have one of.)
  # Without the subshell, the pipe would read into the first "read i",
  # and that's all that would ever happen.

  # Prime the variable "i" so the while will work properly.
  read i
  # As long as there's HTTP text coming back from the GET request, keep
  # reading.
  while [ x"$(echo $i | tr -d '\r')" != "x" ]
  do
    read i
  done
  # Copy whatever's coming in (from fd3) to stdout (which is about to
  # be redirected to $FILENAME).
  cat
 ) >$FILENAME 

echo "done."
man bash
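
Saved as http-get and made executable, a call would look like this (google.de again just as an example; the script expects a path after the host name so it has a file name to write to):

Code:

chmod +x http-get
./http-get http://www.google.de/index.html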

flo
RSAC
Posts: 2297
Joined: 2002-07-28 13:02
Location: Berlin

Re: wget???

Post by flo » 2003-12-11 19:00

Hats off - I did understand it ...

... but you have to come up with that in the first place ... :-)

Thanks and regards,

flo.

darkspirit
RSAC
Posts: 568
Joined: 2002-10-05 16:39
Location: D'dorf

Re: wget???

Post by darkspirit » 2003-12-11 21:57

Interesting... thanks from me too... you never stop learning ;)