Ten Terabytes Transferred
Posted by Michael on 4 December 2004, 04:19 GMT
December 4, 2004 is a date that will be remembered around the world forever. Parades will be held in the streets of New York, Paris, and Pago Pago. Future children will celebrate each year with their parents, holding wild calculator festivals. Why? Because this evening ticalc.org passed further into the realm of undeniable awesomeness with a major bandwidth milestone.
During the period from April 14, 1997 to December 4, 2004, ticalc.org transferred 10 TB of data. That's ten terabytes (equivalent to 10,240 GB). We are currently averaging 6.12 GB of data transferred per day, or 42.82 GB per week. You can find all of these statistics and others on our Web Server Statistics page.
A few other interesting pieces of trivia:
- Hypothetically, it would take a TI-83 Plus approximately 291 years to transfer 10 TB through the link port. If you used TI Connect, it'd probably take a millennium and then crash halfway through.
- The average video rental store holds about 8 TB worth of video data.
- It is estimated that the books of the Library of Congress would amount to 20 TB if digitized. Thus, ticalc.org has transferred "half a Library of Congress".
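The 291-year figure checks out if you assume a sustained link-port throughput of roughly 9600 bits per second; that speed is our assumption for the sake of the estimate, and real-world transfer rates vary:

```python
# Back-of-the-envelope check of the 291-year figure.
# ASSUMPTION: a sustained link-port throughput of ~9600 bits/s;
# actual TI link transfers are slower or faster depending on setup.

LINK_SPEED_BPS = 9600              # assumed bits per second
TOTAL_BYTES = 10 * 2**40           # 10 TB (binary), i.e. 10,240 GB
SECONDS_PER_YEAR = 365 * 24 * 3600

seconds = TOTAL_BYTES * 8 / LINK_SPEED_BPS
years = seconds / SECONDS_PER_YEAR
print(round(years))  # -> 291
```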
We wish to thank all of the people who have visited ticalc.org and contributed to this milestone.
The comments below are written by ticalc.org visitors. Their views are not necessarily those of ticalc.org, and ticalc.org takes no responsibility for their content.
All Hail Ticalc.org
anthony C
Hip, Hip, Horrrayyyy!! Hip, Hip, Horayyyy!! Hip, Hip, Horrayyyy!!
(Grabs large, nice cold soda and sits down to computer to download every file)
4 December 2004, 05:52 GMT
Obligatory /. "humor"
Barrett Anderson
6.12 GB??? what is that? can you put that into perspective for me? how many libraries of congress would that be equivalent to?
4 December 2004, 10:15 GMT
Go Googlebot!
nicklaszlo
How much of this transfer was due to bots? Has Googlebot's crawl expansion accelerated bandwidth usage lately?
I know arXiv.org, which has a much larger HTML-based database, has had problems with bots using too much bandwidth. Their approach has been to attack back. (link)
Funny quote:
"Presumably you neither would be terribly thrilled if every aspiring encyclopedia editor were to send a gang of blind 600 lb gorillas to your library, armed with a photocopy machine."
4 December 2004, 14:30 GMT