Sunday, March 24, 2013

Review: Visual C# 2008 in 24 Hours

Sams Teach Yourself Visual C# 2008 in 24 Hours

Even though we're a few months into 2013 and Visual Studio 2012 has been out for a year, I picked up James Foxall's Visual C# 2008 as a quick way to start working with C# for a job-related assignment. I have a ton of C, C++, and Java experience, but no experience with C# and I haven't worked with the Visual Studio IDEs in quite a while. I needed a jump start into the language and IDE, and that's what I got.

One concern I had was how relevant the book would be, considering that it is five years old and is specific to a two-versions-old edition of Visual Studio. It turns out that it wasn't a problem at all. The visual style of the IDE has changed a little, but the locations and names of the elements have not. The code samples all compiled and ran without modification. In one of the later chapters, Foxall covers code automation using Microsoft Word and Excel. He coded for Office 2007, but I ran these samples against Office 2010 without a problem.

One possible age-related problem is that the link to the sample code given in the text is no longer correct. You can find it at both http://www.informit.com/content/images/9780672329845/examples/Examples.zip and http://www.jamesfoxall.com/downloads/TYCS2008samples.zip. With the exception of a couple of images used in the projects, I didn't find this file very helpful. After all, learning to code is not a spectator sport and if you're just copying code samples, then you're not doing it right.

As for the book, it's fairly well laid out and it does quite a bit of hand holding. It assumes very little coding experience. As is the case with all of the books in Sams Teach Yourself ___ in 24 Hours series, it is split up into 24 chapters, or "hours". Not a single hour took me the full hour to complete. Some took as little as fifteen minutes and most averaged around 30-45 minutes.

There is a good deal of breadth in the text (but not a lot of depth). The book covers Visual Studio and designing applications graphically, object-oriented programming, basic programming techniques (like if-statements and for-loops), manipulating the registry, managing files, working with databases, controlling other applications with automation, and creating an installer. Of course, none of these subjects is covered in much detail. For example, the database chapter assumes that someone else has designed a database for you; the text only shows you how to pull records out and insert new ones.

For me, the book was exactly what I needed. An experienced programmer who wants a jump-start into C# development will find what they need here. If this is you, I would suggest that you just skip chapters 10 through 14; oddly, Foxall places the basic programming techniques here in the middle of the text. This book is supposedly geared toward new programmers, but I think being thrown into the deep end for the first 200 pages and only then being told what a for-loop is would be confusing. It's an odd choice and for that reason, I wouldn't recommend this book if you don't already know the basics.


Tuesday, March 19, 2013

Self-Destructoid

The gaming website Destructoid published a piece last week where their founder, Niero Gonzalez, bemoaned the loss of revenue due to ad-blockers. His basic argument was that the future of his site and sites like his is in danger because so many users are blocking all ads by default that they can no longer receive enough revenue to stay in business.

Micropayments have never really taken off and subscriptions only seem to work if you are a major news website like the New York Times or Wall Street Journal. For the smaller players, it seems that ads are the only realistic source of revenue. And if too many of your users start running ad-blockers, then you're surely going to be driven out of business.

Niero bases most of his argument on data collected by a third-party tool called BlockMetrics, which showed that 42-46% of his viewers are blocking ads. BlockMetrics also allows you to enter your current CPM (a measure of the revenue you get per 1,000 ads shown) and then calculates how much money you have lost on non-ad-viewing readers. For example, if you show 100,000 ads daily at a $1.20 CPM, you're making $120 per day. And if 45% of your readers have an ad-blocker enabled, then BlockMetrics would calculate that you could be showing about 181,818 ads daily and that you're therefore effectively losing $98.18 in revenue every day.
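The BlockMetrics-style arithmetic described above can be sketched in a few lines of Python. To be clear, the function and variable names here are my own, not anything from BlockMetrics itself:

```python
# A rough sketch of the "lost revenue" calculation described above.
# The function name and parameters are my own invention for illustration.

def lost_ad_revenue(ads_shown, cpm, block_rate):
    """Estimate daily ad revenue 'lost' to ad-blockers.

    ads_shown  -- ads actually displayed per day
    cpm        -- revenue per 1,000 ads shown, in dollars
    block_rate -- fraction of readers blocking ads (0.0 to 1.0)
    """
    current_revenue = ads_shown / 1000 * cpm
    # The ads actually shown come only from the non-blocking fraction
    # of readers, so the potential total is ads_shown / (1 - block_rate).
    potential_ads = ads_shown / (1 - block_rate)
    lost = (potential_ads - ads_shown) / 1000 * cpm
    return current_revenue, potential_ads, lost

revenue, potential, lost = lost_ad_revenue(100_000, 1.20, 0.45)
print(f"current: ${revenue:.2f}/day")      # current: $120.00/day
print(f"potential ads: {potential:,.0f}")  # potential ads: 181,818
print(f"lost: ${lost:.2f}/day")            # lost: $98.18/day
```

Note that this matches the article's $98.18 figure, which is exactly why the number is misleading: it assumes every blocked impression would have been worth the same CPM as an unblocked one.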

Let's imagine for a moment that there was a magic switch and Niero could force all of his users to see ads. On day 1, he would see nearly a doubling of ad revenue, but things would surely change from there.

Websites sell ads on the basis of the number of views. Advertisers buy ads on the basis that they will cause people to buy their product. There is a disconnect between what the website is selling and what the advertiser is buying, and this is resolved through the CPM. Websites that can drive clicks (and ultimately purchases) to a product can charge a higher CPM; ones that can't see their CPM fall. If Niero were able to force the 45% of his audience who don't want to see ads to see them, would these people really click on ads at the same rate that the non-ad-blockers do? They would click some, but it stands to reason that the rate would be somewhat lower, which would drive down his CPM. Showing ads to people who don't want to see them is not going to lead to more sales for the advertiser.

The other consideration is that the people who are blocking ads are doing so because ads were so repulsive to them that they were willing to spend the time and effort to find, download, install, and set up an ad-blocker. Admittedly, if you know what you're doing, this doesn't take even five minutes, but still these people were bothered enough to do it. If they are forced to view ads on one website, but not on another, which site are they going to view? Gaming websites, like Destructoid, are numerous and if you don't like one, it's not hard to find another. If Niero forced his audience to view ads, then his audience would surely shrink. And since most traffic to websites is driven by people sharing on Facebook, tweeting on Twitter, posting links on blogs, et cetera, the loss of the ad-blocking audience would surely result in some drop-off in the non-ad-blocking audience as well.

Unfortunately, BlockMetrics data only shows you part of the picture, and it makes it seem like you're leaving money on the table. No one likes to think that half of their income is being stolen. But the truth is that an ad-block rate of 0% would be mostly canceled out by a decrease in CPM and a decrease in readership. Without ad-blockers, Destructoid's revenue might go up, but it would be a modest increase at best.

My advice? Don't go chasing every nickel. You'll make yourself mad trying to do so. It's like Google's Amit Singhal said last week about SEO: "If you build high-quality content that adds value, and your readers and your users seek you out, then you don't need to worry about anything else." Niero should focus on making his content the best it can be, and the advertising revenue is sure to follow.

Sunday, March 17, 2013

Review of TP-Link 200Mbps Powerline Ethernet Starter Kit

Today, I'm going to post a quick review of the TP-Link TL-PA2010KIT AV 200Mbps Nano Powerline Adapter Starter Kit. A video review I made follows, and it includes some setup instructions and other details not in this review, so it's worth watching.

TP-Link Nano Powerline Ethernet Adapters
Pros: In the box are the two adapters and two 2-meter Ethernet cables. Plug each adapter into the wall, connect a cable between it and your computer or router, and you're done. A lot of devices say "get started in seconds", but for this one, it's really true. There are no configuration settings to play with. The units are glossy white, fairly small, and they don't block the other plug. The pins are unpolarized, so you can plug a unit in upside-down - this lets you plug both units into the same outlet (though there's no practical reason to do this, other than for testing). The front of each unit has a synchronization button, in case the units ever become unpaired, and three blue lights to help you diagnose a problem, if you ever have one.

It works flawlessly with Windows and Linux, and the computers see the cable as they would any other Ethernet cable - the computer has no idea that it's using powerline adapters.

Cons: This device will not work on an outlet with a surge protector. I tried that initially and I was getting around 80% packet loss.

Other Thoughts: The advertised speed on these is 200Mbps, which is 25MBps. 4B/5B-style line encoding (4 data bits sent as 5 line bits) cuts the practical speed to 20MBps. But that bandwidth is shared between both directions, and in most cases you'll only be transferring a file one direction at a time, so you really should expect to see 10MBps max. With me so far? If I have both units plugged into the same outlet, I get a file transfer rate of about 9.2MBps, pretty close to the theoretical max. With the units plugged into outlets on opposite ends of my house, I get 3.6MBps, which is still fairly fast. Pings were comparable to wireless pings, hovering around 1.5ms from computer to router across the house.
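For what it's worth, the unit conversions above (advertised bits per second down to expected one-way bytes per second) can be checked in a few lines of Python, assuming the encoding overhead of 4 data bits per 5 line bits and the shared-direction reasoning described in the paragraph:

```python
# Back-of-the-envelope check of the throughput figures above.

advertised_mbps = 200
raw_MBps = advertised_mbps / 8      # 8 bits per byte -> 25 MB/s
usable_MBps = raw_MBps * 4 / 5      # line-coding overhead -> 20 MB/s
one_way_MBps = usable_MBps / 2      # shared both directions -> ~10 MB/s

print(raw_MBps, usable_MBps, one_way_MBps)  # 25.0 20.0 10.0
```

The measured 9.2MBps same-outlet transfer rate lands just under that 10MBps estimate, which is about as good as powerline gear gets.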



Saturday, March 16, 2013

How many people are on the Internet?

How many people are on the Internet? It seems like it should be an easy one to answer, but it's deceptively difficult. Attempting to answer it illustrates many of the problems in making supposedly simple measurements.

First, the world's population is a bit over 7 billion people, so this establishes a nice upper bound. No matter what, we know that the answer lies between zero and 7 billion.

Now, every device on the Internet has a unique IP address associated with it. It should be a simple matter to count up the number of addresses and get a better estimate. However, this is the first problem: IP addresses are unique to devices, not people. Two people sharing a computer count as one IP address, and one person with a cell phone, computer, tablet, and Internet-connected TV counts as four. However, let's ignore that problem for the moment.

Most IP addresses use what's called IPv4, and currently IPv4 addresses are "exhausted"; in other words, every address has been claimed by someone. IPv6 is the newer addressing system, but its usage is fairly small, so we'll ignore it. An IPv4 address is a 32-bit number, so the number of addresses is 2^32 = 4,294,967,296, or approximately 4.3 billion. That means that there is an upper limit of 4.3 billion devices on the Internet.

Not all IPv4 addresses are in use, due to inefficiencies in the distribution system. The first users of the Internet were given large blocks of addresses to dole out to their users; however, many of these initial blocks were extremely large. For example, Stanford University was allocated 16.7 million addresses. That's a bit of overkill for a university with fewer than 20,000 people on campus. They later returned the unused addresses, but these sorts of inefficiencies remain at a smaller scale. For example, many ISPs maintain smaller blocks of unused addresses so that they can give them to new customers. With this in mind, the 4.3 billion device count is an overestimate.
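The address arithmetic in the last two paragraphs is easy to verify. (The assumption that Stanford's 16.7 million addresses correspond to a block with the first 8 bits fixed is mine, but 2^24 works out to exactly that figure:)

```python
# Checking the IPv4 address-space arithmetic above.

total_ipv4 = 2 ** 32   # 32-bit addresses
legacy_block = 2 ** 24  # first 8 bits fixed leaves 24 free bits

print(f"{total_ipv4:,}")    # 4,294,967,296 (~4.3 billion)
print(f"{legacy_block:,}")  # 16,777,216 (~16.7 million)
```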

However, Network Address Translation (NAT) can cause underestimates. NAT allows many devices to share one IP address. Most home users have one IP address for their Internet connection, and every Internet-connected device (computer, laptop, tablet, cellphone, TV, Blu-ray player, etc.) shares that IP address using NAT. So, our 4.3 billion device count is also an underestimate.

So, the number of people on the Internet is somewhere around 4 billion. Possibly more due to people sharing devices and NAT. Possibly less due to people with multiple devices and unused IP addresses.

But what got me thinking about this was a comment from a video game publisher about the number of pirated copies of their latest game. If we can't even figure out reasonably well how many people are on the Internet, how can we possibly answer more sophisticated questions like how many pirated copies of a particular game are there?