<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>
<channel>
	<title>Comments on: Wget and User-Agent Header</title>
	<atom:link href="https://www.krazyworks.com/wget-and-user-agent-header/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.krazyworks.com/wget-and-user-agent-header/</link>
	<description>Networking, Systems Design, and Disaster Recovery</description>
	<lastBuildDate>Mon, 20 Mar 2017 12:44:33 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.7.9</generator>
	<item>
		<title>By: Wget Examples &#124; KrazyWorks</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-275011</link>

		<dc:creator><![CDATA[Wget Examples &#124; KrazyWorks]]></dc:creator>
		<pubDate>Mon, 20 Mar 2017 12:44:33 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-275011</guid>

					<description><![CDATA[[&#8230;] is a follow-up to my previous wget&#160;notes (1, 2, 3, 4). From time to time I find myself googling wget&#160;syntax even though I think I&#8217;ve used [&#8230;]]]></description>
			<content:encoded><![CDATA[<p>[&#8230;] is a follow-up to my previous wget&nbsp;notes (1, 2, 3, 4). From time to time I find myself googling wget&nbsp;syntax even though I think I&#8217;ve used [&#8230;]</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>By: RuMKilleR</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-248186</link>

		<dc:creator><![CDATA[RuMKilleR]]></dc:creator>
		<pubDate>Mon, 01 Apr 2013 10:47:24 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-248186</guid>

					<description><![CDATA[I have a problem with wget: the links are relative and I want them to be absolute.
I mean I am caching recursively, and I want a link, let&#039;s say ../index.php, to appear in the page as:
http://www.original_domain/index.php
]]></description>
			<content:encoded><![CDATA[<p>I have a problem with wget: the links are relative and I want them to be absolute.<br />
I mean I am caching recursively, and I want a link, let&#8217;s say ../index.php, to appear in the page as:<br />
<a href="http://www.original_domain/index.php" rel="nofollow ugc">http://www.original_domain/index.php</a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>By: kiltakblog</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-247828</link>

		<dc:creator><![CDATA[kiltakblog]]></dc:creator>
		<pubDate>Sat, 30 Mar 2013 05:21:40 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-247828</guid>

					<description><![CDATA[The software should measure the web pages downloaded by browsers and any other content (movies, etc.) that I may download, either using torrents or something like wget.
Essentially, it should display how much data I have downloaded in total.
]]></description>
			<content:encoded><![CDATA[<p>The software should measure the web pages downloaded by browsers and any other content (movies, etc.) that I may download, either using torrents or something like wget.<br />
Essentially, it should display how much data I have downloaded in total.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>By: Harriet W</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-247745</link>

		<dc:creator><![CDATA[Harriet W]]></dc:creator>
		<pubDate>Fri, 29 Mar 2013 17:31:17 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-247745</guid>

					<description><![CDATA[When was wget (the Linux download tool) written?
]]></description>
			<content:encoded><![CDATA[<p>When was wget (the Linux download tool) written?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>By: Alex</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-247650</link>

		<dc:creator><![CDATA[Alex]]></dc:creator>
		<pubDate>Thu, 28 Mar 2013 22:14:56 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-247650</guid>

					<description><![CDATA[My upload speed is very slow; downloading a file and then uploading it onto my website would take too much time and bandwidth. Is there any way I can download a file from one website directly onto my own website? My web host doesn&#039;t allow me SSH/telnet access, so I can&#039;t use wget. Any other ideas?
I don&#039;t think FXP will work either, unless you can somehow do HTTP to FTP...
I can&#039;t log onto my webserver, at least not for a command prompt such as telnet or ssh. How else would you suggest I log onto my webserver so that it can download a file from another website?
]]></description>
			<content:encoded><![CDATA[<p>My upload speed is very slow; downloading a file and then uploading it onto my website would take too much time and bandwidth. Is there any way I can download a file from one website directly onto my own website? My web host doesn&#8217;t allow me SSH/telnet access, so I can&#8217;t use wget. Any other ideas?<br />
I don&#8217;t think FXP will work either, unless you can somehow do HTTP to FTP&#8230;<br />
I can&#8217;t log onto my webserver, at least not for a command prompt such as telnet or ssh. How else would you suggest I log onto my webserver so that it can download a file from another website?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>By: Andre</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-247563</link>

		<dc:creator><![CDATA[Andre]]></dc:creator>
		<pubDate>Thu, 28 Mar 2013 07:30:46 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-247563</guid>

					<description><![CDATA[I set up a virtual server running Ubuntu Server. I tried apt-get, curl, wget, ftp, and scp, and none of them will download any packages. How can I get packages installed? It doesn&#039;t even have vi.
]]></description>
			<content:encoded><![CDATA[<p>I set up a virtual server running Ubuntu Server. I tried apt-get, curl, wget, ftp, and scp, and none of them will download any packages. How can I get packages installed? It doesn&#8217;t even have vi.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>By: Cpt Excelsior</title>
		<link>https://www.krazyworks.com/wget-and-user-agent-header/comment-page-1/#comment-246892</link>

		<dc:creator><![CDATA[Cpt Excelsior]]></dc:creator>
		<pubDate>Sat, 23 Mar 2013 13:28:20 +0000</pubDate>
		<guid isPermaLink="false">http://www.krazyworks.com/?p=1320#comment-246892</guid>

					<description><![CDATA[Is there a way to see a list of all the directories in the current directory I am in?
So say, in the directory I am in, I decide to do
mkdir test.
Is there a command that can show that directory in a list, so I can see the &quot;test&quot; directory along with all the others?

Also, what is the difference between
yum and wget?
]]></description>
			<content:encoded><![CDATA[<p>Is there a way to see a list of all the directories in the current directory I am in?<br />
So say, in the directory I am in, I decide to do<br />
mkdir test.<br />
Is there a command that can show that directory in a list, so I can see the &#8220;test&#8221; directory along with all the others?</p>
<p>Also, what is the difference between<br />
yum and wget?</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
