Bluehost vs Dreamhost

As you might have read in my Migrating to WordPress article, I am now the proud owner of both a Bluehost and a Dreamhost account. These two shared hosting providers offer similarly strong packages at similarly low prices, but they’re nevertheless different. Let’s compare:

The raw numbers

PRICE
Bluehost (bluehost.com): $6.95/mon (2 years prepaid)
Dreamhost (dreamhost.com): $7.95/mon (2 years prepaid)

FEATURES
Bluehost:
  • 10 GB storage
  • 250 GB/mon bandwidth
  • 6 domains, 20 subdomains
  • 50 MySQL and 50 PostgreSQL databases
  • 2500 email addresses
  • PHP, Perl, Python, Ruby on Rails
Dreamhost:
  • 20 GB storage
  • 1000 GB/mon bandwidth
  • unlimited domains
  • unlimited MySQL databases
  • unlimited email addresses
  • PHP, Ruby on Rails

ONE-CLICK INSTALL
Bluehost (CPanel/Fantastico): WordPress, pMachine, Nucleus, Drupal, Joomla, PhpNuke, Typo3, phpBB2, osCommerce, Coppermine, Gallery, PHPList, Advanced Poll, PHProjekt, Soholaunch, PhpWiki, phpAdsNew, WebCalendar, Moodle, …
Dreamhost (home-made installer): WordPress, phpBB, Advanced Poll, osCommerce, MediaWiki, Joomla, Gallery, WebCalendar

Continue reading Bluehost vs Dreamhost

Double Wifi: municipal wifi with protection

I have written about FON before (they provide a business model for sharing one’s bandwidth through Wifi). They use custom firmware for the Linksys WRT54G routers. I have the feeling that current Wifi routers (or access points) cannot offer a good balance between security and flexibility. Opening your own network to everyone is currently too dangerous: there are Wifi trolls that gobble up your bandwidth, and there are hackers that scan your ports for vulnerabilities. My idea is that you now need two Wifi zones, one behind the other, each with different security and different policies. With access points costing as little as 25 euro, that is not a big investment.

I see two scenarios:
Scenario 1: first the public
[diagram: Double Wifi, public zone first]

Description
The first router is connected to your broadband and serves the PUBLIC zone (e.g. SSID “FREEWIFI”). The second router, which serves the PRIVATE zone (e.g. SSID “PROTECTED”), is connected to one of the first one’s wired Ethernet ports (the Linksys has 4 of those). Each zone is in a different IP range. The PUBLIC one requires no login; the PRIVATE one requires WPA plus maybe MAC address checking.
PRO
* both the Internet and the PUBLIC zone are outside your PRIVATE network, so you can apply the same firewall settings to both, and ‘dangerous’ traffic never passes over your PRIVATE network.
* the first router can be configured to prioritize traffic from the wired ports, i.e. the PRIVATE network.
CONTRA
* if the PUBLIC router does not support QoS (Quality of Service) or bandwidth shaping, a Wifi troll can consume all the available bandwidth, leaving the PRIVATE network with nothing.
* if the PUBLIC router breaks (or is switched off), no one has an Internet connection.
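To make the zone split concrete, here is a minimal sketch of the addressing plan in Python (the subnets are example values of my own choosing, not FON’s actual defaults):

```python
import ipaddress

# Example addressing plan: the PUBLIC router hands out one subnet,
# the PRIVATE router behind it a different one (both subnets are
# illustrative, not FON defaults).
PUBLIC_ZONE = ipaddress.ip_network("192.168.1.0/24")   # SSID "FREEWIFI", open
PRIVATE_ZONE = ipaddress.ip_network("192.168.2.0/24")  # SSID "PROTECTED", WPA

def zone_of(ip: str) -> str:
    """Classify a client address into one of the two Wifi zones."""
    addr = ipaddress.ip_address(ip)
    if addr in PRIVATE_ZONE:
        return "PRIVATE (WPA + MAC check)"
    if addr in PUBLIC_ZONE:
        return "PUBLIC (no login)"
    return "outside both zones"

print(zone_of("192.168.2.10"))  # PRIVATE (WPA + MAC check)
print(zone_of("192.168.1.99"))  # PUBLIC (no login)
```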

Continue reading Double Wifi: municipal wifi with protection

IVI: Internet voor Iedereen

If your (Belgian) parents or grandparents want to buy a cheap PC to get started on the Web, tell them to hold back for a couple more days. The Federal Government – through FEDICT – has set up a program to sponsor a complete package of PC + software + broadband + training for a sharp price. The title of the project: IVI or “Internet voor Iedereen” (“Internet for Everyone”) – the launch is planned for next week, April 18th.
Continue reading IVI: Internet voor Iedereen

Digital cinema: movie distribution

I wrote about digital cinema earlier. I want to focus now on the distribution of movies to theatres.

FILESIZE OF A MOVIE

The movie’s video signal is compressed and encrypted at a bitrate of max 250 Mbps, which translates into 31.25 MB/second or 112.5 GB per hour of footage. So a ‘short’ 90-minute movie is something like 170 GB, and a 2h30 movie, with some audio thrown in, is more like 300 GB. The estimates from the DCI specification are even higher: around 140 GB per hour of running length (video, audio and subtitles together), or around 38 MB/s.
[chart: movie storage requirements]
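These sizes follow directly from the bitrate. A quick sanity check in Python, using the 250 Mbps maximum quoted above (decimal units, 1 GB = 10^9 bytes):

```python
# Digital cinema: from bitrate to storage (decimal units)
MAX_BITRATE_MBPS = 250                       # DCI maximum video bitrate

mb_per_second = MAX_BITRATE_MBPS / 8         # 31.25 MB/s
gb_per_hour = mb_per_second * 3600 / 1000    # 112.5 GB per hour of footage

def movie_size_gb(minutes: float) -> float:
    """Storage for the video track alone, at the maximum bitrate."""
    return gb_per_hour * minutes / 60

print(f"{mb_per_second:.2f} MB/s, {gb_per_hour:.1f} GB/hour")
print(f"90-minute movie:  {movie_size_gb(90):.0f} GB")    # ~169 GB
print(f"150-minute movie: {movie_size_gb(150):.0f} GB")   # ~281 GB, plus audio -> ~300 GB
```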
Continue reading Digital cinema: movie distribution

Broadband in Brussels

(This post seems to have disappeared when I migrated to WordPress.)
I have what is proving to be an expensive habit: I’m subscribed to over 30 podcasts (including e.g. Diggnation at 300MB/week), I regularly download software to try out, I use BitTorrent on a regular basis, and I buy stuff on iTunes. All that adds up to more than the allowance my ISP subscription gives me (20GB per month). Most months I pay an extra €8 per 10GB.

I’m a Coditel customer (the cable provider in the part of Brussels where I live). I started out on ADSL, but ran into loads of technical problems that Belgacom/Skynet could not solve. My current bandwidth is not bad (although not as fast as Telenet):

http://www.adslbox.be speed test results:
– Download speed : 4415 kbit/s or 552 kbyte/s
(in theory it should be 10.000 kbit/s)
– Upload speed : 233 kbit/s or 29 kbyte/s
Wed Feb 15 2006 at 20:59:42 UTC+0100

But now I want to know: do I have the best formula? So I collected some data. On vergelijking.be all the provider formulas are listed, but that list is not up-to-date, so I collected the latest numbers from the ISPs’ homepages and got some real throughput statistics from adslbox.be and ispmonitor.be.

BROADBAND PRICES

[chart: Broadband in Belgium, sorted by price]
The cheapest broadband one can get is Coditel LightClick: 22,90 € for 1 Mbps. The best price/speed ratio is Telenet ExpressNet Turbo (60 € for a theoretical 20 Mbps and an actual throughput of about 11 Mbps). The one to avoid is Belgacom ADSL Light: 30 € for a meager 0,5 Mbps.

COST OF 50GB/MON

Now let’s see what happens if I go to 50 GB of data transfer per month. Only 6 providers allow for this, either because their GB/mon allowance is big enough or because the price per extra GB is acceptable:
[chart: Broadband in Belgium, cost of 50GB/mon]
Where I live, I cannot get BruTele or Chello, so the only options are Coditel (cable) and Dommel, RealDSL or Mobistar (ADSL).
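The arithmetic behind these charts is simple. A minimal cost model in Python (the 20 GB allowance and the €8 per extra 10 GB block are my own plan’s numbers from above; the €40 base fee is a placeholder, not an actual Coditel price):

```python
import math

def monthly_cost(base_eur: float, allowance_gb: float,
                 extra_eur: float, extra_step_gb: float,
                 usage_gb: float) -> float:
    """Base subscription plus overage, billed per started block of extra GB."""
    overage = max(0.0, usage_gb - allowance_gb)
    blocks = math.ceil(overage / extra_step_gb)
    return base_eur + blocks * extra_eur

# 50 GB of traffic on a 20 GB plan with an €8 per 10 GB overage:
print(monthly_cost(base_eur=40.0, allowance_gb=20,
                   extra_eur=8.0, extra_step_gb=10, usage_gb=50))
# -> 40 + 3 * 8 = 64.0 (€64/month)
```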

GB/MON ALLOWANCE

[chart: Broadband in Belgium, GB/mon allowance]
There are 6 broadband subscriptions with unlimited download allowances: Dommel Netconnect Pro, BruTele @Home and @Turbo, RealDSL Basic and GeekDSL, and Chello Extreme. The only options for me in Brussels are the two ADSL ones, of which the Dommel one is excessively expensive (€150/month).

CONCLUSION

For Brussels, Coditel is still a very good option (now if they could only fill in that empty FAQ page, which looks untouched since the nineties). There is no point in switching from SpeedClick to MegaClick for the GB/mon alone, but if the speed really is double (in theory 20 Mbps instead of 10 Mbps) it might be a nice upgrade. I don’t need something like 200GB/month yet, but if I did, RealDSL would be the best option.

If you live in Flanders and your main concern is speed, go Telenet ExpressNet Turbo. If you need loads of GB/month, go Chello Extreme (where possible) or RealDSL.

UPDATE
RealDSL has NOT been accepting new subscriptions since October 2005; there seems to be a capacity problem with its bandwidth provider Telenet. Luc and Cindy blogged about this earlier, so I have no excuse for my sloppy research. The only DSL provider with an unlimited bandwidth offer seems to be Dommel, but at an extremely high price. Their €33/50GB is, however, a good offer. Thanks for the update, Smetty!

Filling a terabyte iPod

Munster said that within five years, Apple could release an iPod with one terabyte of storage — that’s almost 17 times the maximum amount of iPod storage Apple currently offers.
Munster envisions a one terabyte iPod as a portable, “coffee table” media center that would allow users to store hundreds of movies and thousands of photos and songs.
cnn.com

A 1000 GB iPod, that is:

  • 200 movies or 370 hours of full quality DVD
  • up to 2000 hours (almost 3 months non-stop) at DivX/Xvid/MPEG-4 quality
  • using the H.264 video compression: 120 days or 4 months of video!
  • 1500 music albums of full quality CD (which means, no Sony XCP)
  • 15.000 albums if you rip/compress them to MP3 first, maybe 20.000 if you use WMA/AAC (that is over 2 years of audio to listen to!)
  • 2500 episodes or 100 seasons of TV series like Lost, L-Word, Desperate Housewives, Sopranos, … in compressed format (hey, it’s a 2,5″ screen, who cares about HD?)
  • If your terabyte iPod breaks down and you buy a new one, it will take you between 3 hours (FireWire 800 Mbps) and 2 days (Wifi 802.11g) to fill it up again, from the backup you of course had put on your snug little home 10TB RAID-5 storage cluster thingy (see the sketch after this list).
    If by then all portable devices have 10-Gbit Ethernet built in: 15 minutes will be enough to fill ‘er up.
  • Our then-standard 48 megapixel camera would create 72MB RAW images, of which the iPod could store 14.000, or if you would compress them to 5MB JPEG: 200.000 pictures.
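A quick back-of-the-envelope check of those refill times, using nominal link speeds (real throughput, especially over Wifi, is lower):

```python
# How long to refill a 1 TB (decimal) iPod over various links?
TERABYTE_BITS = 1e12 * 8

links_mbps = {
    "FireWire 800": 800,         # nominal 800 Mbps
    "Wifi 802.11g": 54,          # nominal 54 Mbps; real life is roughly half
    "10-Gbit Ethernet": 10_000,
}

for name, mbps in links_mbps.items():
    hours = TERABYTE_BITS / (mbps * 1e6) / 3600
    print(f"{name:>17}: {hours:6.1f} hours")
# FireWire 800: ~2.8 h, 802.11g: ~41 h (about 2 days), 10GbE: ~13 minutes
```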

Other predictions: the iPhone (or Apple as a mobile virtual network operator) and the iTIVO, a media-center/time-shifting/TV/video/DVD hub, all in the next 12-24 months. Let’s hope this inspires some people to seriously revamp their design/user-interface teams (Nokia, Microsoft, I’m looking at you!).


Know Your (Metric) Limits


From Wired – July 2004:

The universe comes in a box. It’s a big box, and you almost never see the walls, but its boundaries are immovable – the speed of light, gravity, the way atoms interact. Even if time and space are unlimited and illimitable, physics, chemistry, and biology dictate maxima and minima in the universe. Like the strict meter and structure of a sonnet, they make the final product all the more beautiful. – Adam Rogers

5 billion Years – Maximum time Earth has left.

That’s when the sun goes red giant and expands past Earth’s orbit.

5.4 * 10^-44 seconds – Shortest possible time.

Any shorter and quantum mechanics can’t tell whether events are simultaneous.

1.419 * 10^26 meter (15 billion light-years) – Maximum distance we can see.

The universe is about 15 billion years old – this is light’s travel time.

1.6256 * 10^-35 meter (6.4 * 10^-34 inches) – Shortest possible distance.

Planck length: any shorter and quantum mechanics can’t tell between here and there.

34.92 km (21.7 miles) – Maximum height of a mountain on Earth.

Uplift reaches equilibrium with pressure at the base.

3.048 * 10^-7 m (1.2 * 10^-5 inches) – Minimum size of an actively growing cell.

Free-living cells need room for a full genome, proteins, and guts.

130 m (427 feet) – Maximum height for a tree on Earth.

Gravity overcomes surface tension in the plant’s circulatory system.

265 – Minimum number of protein-coding genes for life.

As seen in the smallest known single-cell organism.

200 million years – Maximum age of sub-oceanic crust.

Older than that: it cools, becomes denser, and “subducts” back into magma.

-273.15 ° Celsius (-459.67 ° Fahrenheit) – Minimum possible temperature.

Heat is a function of molecular motion, which stops at absolute zero.

338 km/h (210 MPH) – Maximum wind speed for an Earth hurricane.

A storm can acquire only so much energy from the sea.

0.24 second – Minimum delay of a signal sent via geosynchronous satellite.

It’s light speed up 35.600 km (22.300 miles), and back down.
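That one is easy to verify yourself; the only inputs are the speed of light and the altitude of a geosynchronous orbit:

```python
# Round-trip delay to a geosynchronous satellite
C_KM_PER_S = 299_792        # speed of light
GEO_ALTITUDE_KM = 35_786    # geosynchronous orbit altitude

delay_s = 2 * GEO_ALTITUDE_KM / C_KM_PER_S
print(f"{delay_s:.3f} s")   # 0.239 s -- the 0.24 second minimum above
```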

430.000 Mbps – Maximum speed to record data to magnetic media.

Bits won’t flip reliably with a pulse under 2.3 picoseconds.

100 Tbps – Maximum information bandwidth over optical fiber.

Higher power levels mash signals together.

10^51 operations per second – Maximum computational power.

Quantum rules won’t let the ideal 1-liter, 1-kilogram laptop crunch data any faster.

Contributors: Sunny Bains, Thomas Hayden, Greta Lorge, Michael Myser, and Boyce Rensberger / Sources: Fire in the Mind: Science, Faith, and the Search for Order (Knopf, 1995); Institute for Genomic Research; Lucent Technologies; MIT; NASA; National Institute of Standards and Technology; Nature; UC Berkeley; Woods Hole Oceanographic Institution; Yale

via Andrew Ferguson and bytehead.org


Podcast hosting: cheap or free?


Podcasting is a fun hobby, but leaves you with several tens to hundreds of megabytes of MP3 files to host. If your podcast turns out to be popular, you might also have over 20GB of file downloads per month (‘bandwidth’). This rules out any free hosting option like Geocities or even your local ISP. What are the other options?

CCPublisher:

free
Creative Commons, together with Archive.org, offers to host your content for free. This is aimed at CC-licensed or open-source audio: your own speech or your own music. Don’t use it to host illegal or copyright-troubled content.

idisk.mac.com:

$100/year (or $8.5/month)
If you’re already a subscriber to Apple’s .Mac program, this is an easy option. It is not the fastest or most reliable one, though.

libsyn.com:

starts at $5/month (up to $30)
Built for podcasting: pricing is based on the number of MB you upload per month, not on the number of GB downloaded per month, so the cost is predictable. Has detailed statistics (although some graphs would be nice). “Liberated Syndication is podcasting made easy”

bluehost.com:

$6.95/month (2-year subscription)
2 GB storage, 75 GB/month bandwidth. A general-purpose host, so if you want to run the actual podcast blog there too, you can (you can add a WordPress blog through the Fantastico interface).

EV1Servers VPS:

$39/month
For the bigger fish: 10 GB of storage, 100 GB/month bandwidth. If even that is not enough, you can go up to a $99/mon fully dedicated server: 60 GB storage, 1000 GB bandwidth.

For up-to-date information, keep an eye on the podcasters Yahoo! group.


CD-to-MP3 ripping speed estimation

Like every sensible car owner in Brussels, I rip my CDs to MP3 so I can put copies of them in my car. And like every self-respecting geek, I have multiple PCs at home. Which brings me to the following observation: not all PCs rip alike. On one PC the CPU maxes out at 100% for the whole ripping procedure; on the other, it never gets above 75%. So I started wondering: what determines the maximum ripping speed you can get on a PC?
My hunch:

the CD-ROM drive speed:

the original CD audio specification required a constant data rate. This was implemented by running the CD at 500 rpm for the first/inner tracks on the CD (ø 48mm) and at 200 rpm for the outer tracks (ø 118mm). If the CD had been played at a constant 500 rpm, the data rate at the end would have been 500/200 = 2,5X. (cf Devnulled: Ripping speed)
With CD-ROM, the data should be delivered as fast as possible, so the rotation speed is turned up as far as it will go. The physical boundaries are the vibrations and the centrifugal forces that occur at high speeds. Maxell claims the maximum safe speed is 48X. Since the “48X” is marketing speak, this speed is only obtained at the outer border of the CD: the rotation speed would thus be 48 x 200 = 9600 rpm. Some CDs seem to explode above 10.000 rpm.
To convert this into a data rate: at 9600 rpm, the outer tracks deliver 48x the data rate of an audio CD: 67,74 Mbps or 8,47 MB/s. The inner tracks, at ø 48mm, deliver data about 2,5 times slower: 27,55 Mbps or 3,44 MB/s.
Real-life tests of a whole bunch of drives on DAE speed results.
For the exact sizes: CD-R/CD-RW technical specifications
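In Python, that conversion looks like this (1X = 176.400 bytes/s, the raw audio CD data rate; MB are decimal):

```python
# Data rate of a "48X" drive at the inner and outer edge of a CD
RATE_1X = 176_400            # bytes/s of raw CD audio (44100 Hz * 2 ch * 2 bytes)
INNER_D, OUTER_D = 48, 118   # track diameters in mm

outer_rate = 48 * RATE_1X                    # 48X only at the outer edge
inner_rate = outer_rate * INNER_D / OUTER_D  # constant rpm: rate scales with diameter

for label, rate in (("outer", outer_rate), ("inner", inner_rate)):
    print(f"{label}: {rate * 8 / 1e6:5.2f} Mbps = {rate / 1e6:4.2f} MB/s")
# outer: 67.74 Mbps = 8.47 MB/s
# inner: 27.55 Mbps = 3.44 MB/s
```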

the bus speeds:

the CD-ROM drive is connected to the PC by an ATAPI, SCSI, FireWire or USB connection. In theory there could also be a network in between (e.g. an Ethernet-connected CD jukebox).
The slowest, ATA-33, has a theoretical max throughput of 33 MB/s. Most modern SCSI buses go above 20 MB/s, and FireWire gives 50 MB/s. So none of these would be the bottleneck in the ripping process.
USB 1.1, however, is limited to 1,5 MB/s (in practice even lower). Most common networks would be a bottleneck too: Fast Ethernet has a theoretical 12,5 MB/s, but 7 MB/s is a more realistic top rate in practice, certainly if the network is used for other things as well. The same goes for Wifi: 802.11g’s advertised “54 Mbps” will in real life never translate into an actual 6,75 MB/s throughput.

the CPU speed:

encoding raw audio data to MP3 is CPU intensive. The main parameter is the clock speed, which I would expect to scale linearly: a 2 GHz processor encodes twice as fast as a 1 GHz one. Extra influences: the brand of processor (Intel/AMD), the model (Celeron/Pentium4/Athlon/Athlon64), the number of processors (or HyperThreading), and the software you use to encode (LAME/GOGO/RealPlayer/Windows Media Player/…).
Some data can be found on GamePC.com: an Intel P4 3.06 GHz encodes 200 MB of raw data into 160 kbps MP3 in 57 seconds: 3,5 MB/s or 20X. The AMD AthlonXP 2700+: 3,28 MB/s or 18,6X. More info on GamePC.com confirms our hunch that performance scales linearly with clock speed. For the Pentium4: 1,15 MB/s per GHz, or 6,5X per GHz.

the MP3 bitrate:

the above numbers are for 160 kbps, but what about 192 kbps or 64 kbps? Is encoding faster or slower? I found no data on the net, and I haven’t tested it myself, so no hunch here. Also, the output of the encoding process, even at a very high quality 320 kbps, is well within the capacity of any output channel, even Bluetooth, god forbid. So I don’t take that parameter into account.


So in the following situation:

  • a 24X CD-ROM drive
  • a Pentium 4 2,8GHz processor
  • ripping with the LAME encoder to 160 kbps

Your rip will start at about 9,8X and speed up until your CPU is saturated at 18,2X; the sketch below models this profile. And there’s your rule of thumb.
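A minimal model of that profile (the linear ramp with track diameter and the 6,5X-per-GHz figure are the simplifications from above):

```python
# Ripping speed = min(what the drive can read, what the CPU can encode)
DRIVE_MAX_X = 24             # a "24X" drive reaches 24X only at the outer edge
INNER_D, OUTER_D = 48, 118   # track diameters in mm
CPU_X = 2.8 * 6.5            # Pentium 4 at 2.8 GHz, ~6.5X per GHz -> 18.2X

def rip_speed(diameter_mm: float) -> float:
    """Effective ripping speed while reading at the given track diameter."""
    drive_x = DRIVE_MAX_X * diameter_mm / OUTER_D  # constant-rpm assumption
    return min(drive_x, CPU_X)

for d in (48, 70, 90, 118):
    print(f"diameter {d:3d} mm: {rip_speed(d):4.1f}X")
# 48 mm: 9.8X, 70 mm: 14.2X, 90 mm: 18.2X (CPU-bound from here on), 118 mm: 18.2X
```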

Remark: looking at the benchmarks, adding a second processor (or HyperThreading) does not improve the ripping speed, probably because the MP3 encoding code is not parallelised. But if you have 2 CPUs, only one will go to 100%, so you keep some breathing room while your PC is creating the MP3s.

Binary confusion: kilobytes and kibibytes

When I created my Bandwidth Calculator, easily the most popular web tool I ever made, I came across the following problem: in computer technology there is a habit of using kilobyte (KB) for 1024 bytes and megabyte (MB) for 1024*1024 (1.048.576) bytes. Most of you might think this is correct, but it’s not. The International System of Units (SI) (which defines the kilo, mega, giga, … and milli, micro, nano prefixes) uses only base-10 values. A kilo is always 1000, even for bytes. To find a solution for the IT ‘contamination’ of using kilo for 2^10 instead of 10^3, the IEC introduced new units in 1998:

In 1999, the International Electrotechnical Commission (IEC) published Amendment 2 to “IEC 60027-2: Letter symbols to be used in electrical technology – Part 2: Telecommunications and electronics”. This standard, which had been approved in 1998, introduced the prefixes kibi-, mebi-, gibi-, tebi-, pebi-, exbi-, to be used in specifying binary multiples of a quantity. The names come from the first two letters of the original SI prefixes followed by “bi”, which is short for “binary”. It also clarifies that, from the point of view of the IEC, the SI prefixes only have their base-10 meaning and never have a base-2 meaning.
(from en.wikipedia.org)

So this is the correct usage for file, disk, memory size:

Kilobyte (KB): 1.000      Kibibyte (KiB): 1.024
Megabyte (MB): 1.000 ^ 2  Mebibyte (MiB): 1.024 ^ 2
Gigabyte (GB): 1.000 ^ 3  Gibibyte (GiB): 1.024 ^ 3
Terabyte (TB): 1.000 ^ 4  Tebibyte (TiB): 1.024 ^ 4
Petabyte (PB): 1.000 ^ 5  Pebibyte (PiB): 1.024 ^ 5

The problem is: the industry has not adopted these standards. If Windows shows the size of a disk, it converts 28.735.078.400 bytes to “26.7 GB”. It should be either 28.7 GB, or 26.7 GiB. Remember the 1.44MB floppy? It actually never existed: it is either 1.40MiB or 1.47MB.
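The conversion is trivial to demonstrate; both figures above are the actual inputs (note that rounding instead of truncating gives 26.8 and 1.41):

```python
# The same byte count, in SI (decimal) and IEC (binary) units
disk = 28_735_078_400
print(f"{disk / 1000**3:.1f} GB or {disk / 1024**3:.1f} GiB")      # 28.7 GB or 26.8 GiB

floppy = 1_474_560   # the "1.44 MB" floppy: 1440 KiB
print(f"{floppy / 1000**2:.2f} MB or {floppy / 1024**2:.2f} MiB")  # 1.47 MB or 1.41 MiB
```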

On September 18 2003 Reuters has reported that Apple, Dell, Gateway, Hewlett-Packard, IBM, Sharp, Sony and Toshiba have been sued in a class-action suit in Los Angeles Superior Court for “deceiving” the true capacity of their hard drives. This of course was due to ambiguity of “GB” when used by software and hardware vendors. This precedent might prompt Apple to adapt binary prefixes in its Mac OS, as well as other companies to put pressure on Microsoft to adapt them in its Windows operating systems.
from members.optus.net

One could argue: people have always used MB = 1024*1024 for disk drives, why change now? Well, clarity is a good reason, and so is unambiguity. NASA lost the Mars Climate Orbiter because engineers mixed up metric and imperial units. Don’t even get me started on miles per gallon.

So: a disk of 160GB should have 160.000.000.000 bytes. And it is about 150GiB. Get over it.