Saturday, July 5, 2014

Rental House

If you're here, it's most likely because you saw my ad for a house for rent.

This page exists to help prevent scammers from stealing my photos and using them in rental scams. Please see Craigslist's Scams page for more information on the typical types of scams.

Below are links to my current rental ads. If you got here from any other page, you are most likely looking at a scam ad.

Current Links:


Wednesday, March 26, 2014

Why the Windows XP Support Deadline Doesn't Matter


On April 8th, Microsoft is discontinuing support for Windows XP. XP is still used on about 30% of the PCs in the world and Microsoft has been working hard to push people to Windows 8.1 under the threat of “harmful viruses, spyware, and other malicious software” (http://windows.microsoft.com/en-us/windows/help/what-does-end-of-support-mean).

Despite this, most people aren't budging. Windows XP is comfortable and it works. And Windows 8.1 is infamous for its Metro/Modern interface, which is generally reviled. This has led many tech commentators to throw out doomsday scenarios regarding the end of support. But I'd like to take a minute to debunk them:


It's the Most Patched Operating System of All Time

Windows XP was originally released in October 2001, so April will mark twelve and a half years of patches and fixes. The last service pack, SP3, was released in May 2008, which marks the last time XP got new features (and new opportunities for bugs). It has since accumulated nearly six years' worth of patches.
With all of these patches and fixes, XP is actually a pretty secure system. The code is mature, tested, and in wide use. Security researchers have been looking for weaknesses for over a decade, and as Linus's Law puts it, given enough eyeballs, all bugs are shallow.


It's Not that Great of a Target

At this point, there are basically two types of machines still running XP: those that can't upgrade and those that won't. The "those who won't" group are the home and small business users who are happy with things the way they are and just don't want to upgrade. Their machines tend to be older, with fewer CPU, disk, and network resources. In other words, there's not much worth exploiting. These computers may be up to a decade old and so obsolete that even if compromised, there isn't much that can be done with them.

The second type, "those who can't," are those using XP in very custom applications, such as ATMs. An ATM sounds like a great target to attack, except that these machines are not accessible via the Internet, banks have their own private security teams looking for suspicious behavior, and they are paying Microsoft to continue supporting them privately. Even though the operating system itself may be weak, the security layers around it are very strong.

Without a good target to go after, there's not much point in working to exploit XP. Windows 7, with its larger user base and newer hardware, is a much more attractive target.


The End is Already Here

With less than two weeks before the April 8th deadline, now is the time when a smart criminal would release a hack that exploits an unknown weakness. At this point, even if a new threat emerges, there is not enough time left for Microsoft to find the code at fault, fix the bug, test a patch, and then release it.

If criminals have any previously unknown exploits, we would expect to see them released into the wild now because, as in any other business, there is a first-mover advantage. The longer they wait, the higher the probability that someone else releases malware using the same exploit first. The fact that we haven't seen any such attacks suggests that there aren't any major undisclosed weaknesses in XP waiting to be used.



The Actual Deadline is May 13th

Microsoft releases patches on the second Tuesday of every month, so if Microsoft weren't discontinuing support for XP, the next set of patches would arrive on May 13th. That's the first date on which there could have been a patch that now won't come. Up until that date, the support situation for XP is identical to what it would be if Microsoft were continuing support.
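If you want to verify the dates yourself, here's a small Python sketch (my own, not part of the original post) that computes the second Tuesday of any month:

```python
# A small sketch that computes "Patch Tuesday", the second Tuesday of a month.
import calendar

def second_tuesday(year, month):
    cal = calendar.Calendar()
    tuesdays = [d for d in cal.itermonthdates(year, month)
                if d.month == month and d.weekday() == calendar.TUESDAY]
    return tuesdays[1]

print(second_tuesday(2014, 4))  # 2014-04-08: the last Patch Tuesday for XP
print(second_tuesday(2014, 5))  # 2014-05-13: the first one XP will miss
```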


Support from Everyone Else is Continuing

Google has announced ongoing support for Chrome on XP. So have most antivirus manufacturers. Most software on XP will continue to see updates for the time being as well. Even if a major hole is found in XP, keeping the rest of the software on the PC up-to-date will help mitigate the risk.


Conclusion

The moral of the story is that sooner or later, you should upgrade. Whether that's to Windows 8.1, a version of Linux, or a Mac doesn't matter. But there is no rule that says it has to be done by April 8th. So make an informed decision, weigh the benefits and risks, and upgrade when you're ready to.

Friday, March 21, 2014

Amazon Prime and ShopRunner

If you've ever done any online shopping, you're probably already familiar with Amazon's Prime shipping service. For $99/year (previously $79 until mid-March 2014), most of your Amazon orders are delivered within two days for free. You also get some other benefits with your membership, such as access to a lot of free TV shows and movies through their Amazon Instant Video streaming service.

Amazon Prime is a great service. While you're shopping, Amazon lets you know how much longer you have to place your order in order to make that day's cutoff time ("place your order in the next 53 minutes to get this order by Thursday"). On the final page of the ordering process, Amazon tells you exactly when the order will be delivered.

Shipping occurs Monday through Saturday; there are no deliveries on Sundays and holidays. Even so, if you place an order on Sunday, it usually still counts as a shipping day, so you receive your package on Tuesday.

In my experience with Prime, I've only had a couple of packages take three days (excluding Sundays and holidays). My best order was a package of specialty pens that only local art stores and Staples carry. It was 11pm on a Friday night; the local stores were closed and the first one wouldn't open until 9am the next morning. I placed my order, and at 8:30am the next morning a courier dropped off the package from Amazon. Not only did they get me my two-day order within 10 hours, they actually got it to me faster than I could have picked it up from a local store.

ShopRunner positions itself as an Amazon Prime type service for the rest of the web. The pricing is similar ($79/year) and it enables you to get free two day shipping from nearly 100 online stores. In principle, it should be just as awesome as Prime; in practice, not so much.

The first problem with ShopRunner is the poor integration between it and the partner stores. You have no indication from the store when your order needs to be placed, nor do you get a delivery estimate when you place your order.

You also don't get two-day shipping; you get delivery within two days. The distinction is important. If the store determines that UPS 3 Day Select is likely to arrive within two days, it can use that service. But UPS only guarantees delivery within three days, so many of my ShopRunner orders have arrived late.

But the biggest problem is ShopRunner's definition of "2-Day shipping". First, the order must be placed by that store's "cut-off time" in order to ship the same day. All of the times are different and the earliest ones are 7am PST. ShopRunner also doesn't consider Saturdays, Sundays, and holidays to be shipping days.

Consider the following example: you order a new USB flash drive from NewEgg.com at 10am on Friday (NewEgg's cutoff is 9am PST). The order won't ship on Saturday or Sunday. If Monday is a holiday, then it won't ship then either. On Tuesday, NewEgg actually ships your order using UPS 3 Day Select. You finally receive your order on Friday afternoon, a full week after you ordered it. This example actually happened to me. Had I ordered from Amazon, there is a good chance I would have received my package the next day.
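To make the calendar math concrete, here's a rough sketch of how those rules play out. The specific dates, holiday, cutoff, and three-business-day transit time are hypothetical, chosen only to mirror the example above, not ShopRunner's actual rules:

```python
# Rough sketch of the shipping-day arithmetic in the example above.
from datetime import date, timedelta

HOLIDAYS = {date(2014, 5, 26)}  # hypothetical Monday holiday (Memorial Day 2014)

def is_shipping_day(d):
    # Monday-Friday and not a holiday; weekends and holidays don't count.
    return d.weekday() < 5 and d not in HOLIDAYS

def next_shipping_day(d):
    d += timedelta(days=1)
    while not is_shipping_day(d):
        d += timedelta(days=1)
    return d

def estimate(order_date, after_cutoff, transit_days=3):
    # Orders placed after the store's cutoff (or on a non-shipping day)
    # don't ship until the next shipping day.
    if is_shipping_day(order_date) and not after_cutoff:
        ship = order_date
    else:
        ship = next_shipping_day(order_date)
    delivery = ship
    for _ in range(transit_days):       # e.g. UPS 3 Day Select
        delivery = next_shipping_day(delivery)
    return ship, delivery

# Order placed Friday, May 23 at 10am (after a 9am cutoff), Monday is a holiday:
print(estimate(date(2014, 5, 23), after_cutoff=True))
# -> ships Tuesday May 27, delivered Friday May 30: a full week after ordering
```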

Another problem is that the stores aren't even aware that they offer this service. I ordered a pizza from Domino's, where ShopRunner is supposed to waive the delivery charge ($2.50 in my area). I had signed in with ShopRunner, but for some reason the discount was not applied to my order, so I called the store. The person who answered had no clue what I was talking about, so she got the store manager. The manager had never heard of ShopRunner either. Even worse, she acted like I was trying to scam her into free delivery and wouldn't even check the store's own website to verify what I was saying. I ended up canceling my order and going out for lunch instead.

On the plus side, if you have an American Express card, you get a complimentary subscription to ShopRunner. And if you have Amazon Prime, ShopRunner will give you a one-year free subscription.

You can find out more about both services here:

Amazon Prime
ShopRunner

Just make sure you understand the benefits and limitations of each before you sign up.

Wednesday, July 31, 2013

Nexus and Android Benchmarks - Part III - 2012 vs. 2013 Nexus 7s

In part I, I compared the 2012 Nexus 7 before and after a reboot. In part II, I compared the 2012 Nexus 7 running Android 4.2 with the same tablet running Android 4.3.

Today, I compare the 2012 Nexus 7 running Android 4.3 with the brand new Nexus 7.

The new Nexus 7 features a new CPU and GPU, a refined design, the latest Android version, both front and rear cameras, and a bunch of other improvements. Personally, I'm impressed with how good it feels to hold; the slightly narrower bezel really does make a huge difference. But, I digress... onto the benchmarks!

For these tests, I will be using the AnTuTu benchmark app. For each configuration, I ran the benchmark ten times. For the original Nexus 7, the average of the benchmark results was 11,977 with a standard deviation of 178. The new Nexus 7 scored an impressive average of 20,249 with a standard deviation of 240. The new Nexus 7 is clearly much faster!

As for the subsystem sections of the results, the CPU scored on average 28% faster. The RAM was 42% faster and the I/O was 7% faster. But the GPU really took the crown with performance that was 169% faster. I could see the difference in the graphics benchmarks: whereas before, a couple of the tests had such low framerates that it looked like a slideshow, now those same tests are much smoother.

It's not strictly necessary in a case like this where the performance is clearly improved, but for fun I went ahead and performed a t-test, which is a statistical method for determining how much of the difference between two sets of measurements is attributable to the variable that changed. The average difference between the two trials was 8272 (with a standard deviation of 295) or roughly a 69% performance increase. And the t-test confirmed that the >8000 point difference was due to the actual difference between the tablets (and not just statistical noise) with a >99% probability.
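As a sanity check, here's how the same comparison looks in SciPy using just the summary statistics above. This is my own sketch; the post doesn't specify which t-test variant was used, so I've assumed an unpaired (Welch's) test:

```python
# Two-sample t-test from the summary statistics of ten AnTuTu runs per tablet.
from scipy import stats

old_mean, old_sd, n = 11977, 178, 10   # 2012 Nexus 7 on Android 4.3
new_mean, new_sd = 20249, 240          # 2013 Nexus 7

t, p = stats.ttest_ind_from_stats(old_mean, old_sd, n,
                                  new_mean, new_sd, n,
                                  equal_var=False)   # Welch's t-test
print(t, p)  # |t| is enormous and p is far below 0.01, i.e. >99% confidence
```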

Sunday, July 28, 2013

Nexus and Android Benchmarks - Part II

Part I is here.

In this post, I'm going to compare the 2012 Nexus 7 with Android 4.2 to itself with Android 4.3. Android 4.3 offers a few performance enhancements, most relating to OpenGL. Specifically, the new version includes OpenGL ES 3.0. This benchmark only runs ES 2.0 tests, but I'm hoping to see at least some improvement in the GPU tests.

For each configuration, I ran the benchmark ten times. For the Nexus 7 with Android 4.2, the average of the benchmark results was 12,019 with a standard deviation of 134. After installing Android 4.3, the average was 11,977 with a standard deviation of 178. The results are similar, but the new version scores slightly worse.

What I did then was perform a t-test, which is a statistical method for determining how much of the difference between two sets of measurements is attributable to the variable that changed. In this case, I wanted to see how much of the change in performance was attributable to Android 4.3, as opposed to statistical variation in the measurements.

The average difference between the two trials was -42 (with a standard deviation of 141), or roughly a 0.5% performance decrease. However, the t-test gave a t value of less than 1, which means the difference is statistically indistinguishable from noise; any real performance change is completely negligible.
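To see where that "less than 1" comes from, here's the arithmetic, assuming a paired test over the ten runs (the post doesn't state the exact test variant):

```python
# Paired t statistic from the summary numbers above (assumed paired test).
from math import sqrt

mean_diff, sd_diff, n = -42, 141, 10
t = mean_diff / (sd_diff / sqrt(n))
print(t)  # about -0.94, i.e. well under 1 in magnitude
```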

The takeaway message is that you're not going to see an improvement performance-wise in going from Android 4.2 to 4.3. Instead, the new version appears to focus more on new features, such as restricted profiles and OpenGL ES 3.0.






Saturday, July 27, 2013

Nexus and Android Benchmarks - Part I

In a three-part series, I'm going to compare the relative performance of the original and new Google Nexus 7 tablets. The new Nexus 7 comes with Android 4.3; the 2012 Nexus 7 had Android 4.2 on it, but these tablets will be getting an over-the-air (OTA) update during the next couple of weeks. Unlike most other comparisons you'll see, I spent a little extra time trying to extract some statistically meaningful data.

For these tests, I will be using the AnTuTu benchmark app. Why this app? Well, it breaks the results down into CPU, GPU, RAM, and I/O scores, so you can see how the different components of the system are performing. Also, it's a free download and it has a good reputation for producing reliable benchmark results.

In this post, I'm going to compare the 2012 Nexus 7 with Android 4.2 to itself, before rebooting and after. Before rebooting, the tablet had been on and in daily use for over a month. One difference between iOS and Android devices is the way they handle multitasking, and I wanted to see whether having other apps running in the background resulted in a drop in performance.

For each configuration, I ran the benchmark ten times. For the Nexus 7 before rebooting, the average of the benchmark results was 11,910 with a standard deviation of 100. After rebooting, the average was 12,019 with a standard deviation of 134. So, they look comparable, but the reboot seems to have helped a bit.

What I did then was perform a t-test, which is a statistical method for determining how much of the difference between two sets of measurements is attributable to the variable that changed. The example that is typically given is that if you have two sets of cancer patients and you give one set an experimental drug, you want to know how much of their improvement is attributable to the drug. In this case, I wanted to see how much of the change in performance was attributable to the reboot, as opposed to statistical variation in the measurements.

The average difference between the two trials was 109 (with a standard deviation of 141), or roughly a 1% performance increase. However, the t-test gave a value of 2.44 (about a 96% probability that the reboot made some difference), which is only borderline significant; even taking it at face value, a 1% improvement is practically negligible.
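For anyone who wants to reproduce those numbers, here's a minimal sketch of the calculation, again assuming a paired t-test over the ten runs (the post doesn't say which variant was used):

```python
# Paired t-test from the summary numbers in the post: mean difference 109,
# standard deviation of the differences 141, ten paired runs.
from math import sqrt
from scipy import stats

mean_diff, sd_diff, n = 109, 141, 10
t = mean_diff / (sd_diff / sqrt(n))      # about 2.44, as reported above
p = 2 * stats.t.sf(abs(t), df=n - 1)     # two-tailed p-value, about 0.037
print(t, 1 - p)                          # roughly 96% confidence
```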

The takeaway message is that Google has done a very good job implementing multitasking in Android, and rebooting your device won't help you eke out any additional performance.






Friday, April 5, 2013

Worry Less About Moore's Law

You can't read more than a couple of articles about upcoming CPUs or chipsets without running into one claiming that Moore's Law is coming to an end. Moore's Law is usually (and incorrectly) stated as "microprocessors double in speed every 18 months." These articles then claim that with the end of Moore's Law, we'll stop seeing improvements in computer technology.

The first thing wrong with Moore's Law is that it's not a law but an observation. Laws are unbreakable; observations aren't. Second, Gordon Moore said that it was the transistor density that was doubling, not the speed. Third, he observed that it was doubling every two years, not every eighteen months. And fourth, the observation was made about silicon semiconductor technology specifically.

Interestingly, Moore made the observation in 1965 using data from 1958 to the then-present, and the Law has actually held up pretty well for the last fifty years. The reason may simply be a case of the prediction leading the technology: if transistor densities have doubled like clockwork and you know your competition is also working on the next doubling, you are well motivated to make it happen.

Even so, Moore's Law is coming to an end. Silicon transistors can only get so small; a single-atom transistor is as small as you could ever hope to get, and for silicon that's about 0.25nm. Right now, Intel's latest chips use a 22nm process (roughly the half-pitch of the smallest features). So there is a limit, and we are approaching it. There is no way that transistor densities can keep doubling for the next twenty years without shrinking transistors down to the scale of individual atoms.
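Here's a quick back-of-the-envelope check (my own arithmetic, not from the article), assuming density doubles every two years and scales with the inverse square of the feature size:

```python
# How small do features get over twenty years of density doublings,
# starting from a 22nm process?
feature_nm = 22.0
for year in range(2, 22, 2):            # the next twenty years
    feature_nm /= 2 ** 0.5              # one density doubling = sqrt(2) shrink
    print(f"+{year} years: {feature_nm:.2f} nm")
# After ten doublings the features are about 0.7nm wide -- only two or three
# silicon atoms -- so the doubling can't continue much beyond that point.
```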

But I'm not worried, and here's why. First, Moore's Law makes a prediction about transistor size, not processing speed. Current chips have over a billion transistors on them, and when you have that much wiring, there is invariably room for improvement. In fact, the current battle between Intel's x86 instruction set and ARM's RISC instruction set is evidence of that. If there were one "right" way to hook up a billion transistors, Intel would do it and be done with it. The fact that it has to compete with ARM and AMD shows that there is still plenty of room to compete on architecture.

Also, even for a fixed transistor density, you can increase the number of transistors by increasing the chip area or going three-dimensional. All chips today are made in nearly flat 2D layers, but research has been done into layering transistors. If this were accomplished, you would have an overnight doubling of transistors on a chip. And since each layer is only a few nanometers thick, you can have thousands of layers before the chip becomes appreciably thicker. Of course, there are a lot of challenges in this arena, specifically in heat dissipation and signaling, but it is a way forward.

Moore's Law is also an observation on silicon semiconductor integrated circuit technology. The original electronic transistors were vacuum tubes, which were then replaced by individual transistors. Those were replaced by silicon integrated circuits which we use today. It is inevitable that someday we will move away from silicon to something better. We may figure out how to make transistors at the subatomic level, or quantum computing might finally mature. But most likely, it will be some technology that we haven't even heard of yet that will replace it.

In the short term, Moore's Law looks like it still has a bit of life left in it. I expect that within this decade we will see the first signs of a transition away from silicon to a similar technology (such as gallium-arsenide transistors). Then, shortly thereafter we'll see a transition away from the integrated circuit technology that has been the center of computing and technology for more than half a century.

If you feel like checking it out for yourself, Intel has made a copy of the original 1965 paper available on its website.