Lion of the Blogosphere

Archive for the ‘Technology’ Category

New Apple announcement

with 20 comments

There is actually very little here.

There are two new overpriced “RED” versions of the iPhone 7 and 7+. Only difference is the color. Big deal.

The iPhone SE, the most affordable iPhone, stays at the same price but comes with twice as much storage as before: 16GB is doubled to 32GB and 64GB is doubled to 128GB. This makes the iPhone SE an even better bargain than before, if you don’t mind a smaller phone.

The iPad Air 2 is replaced by a new iPad called simply “iPad.” It has an A9 processor instead of the A9X in the iPad Pro (the A9X offers only a very slight performance improvement), but at only $329 it’s a real bargain compared to the $599 list price of the iPad Pro (although I only paid $427.50 for my iPad Pro).

What do you get with the Pro that you don’t get with the iPad? Support for the Apple Pencil and Smart Keyboard, four speakers instead of only two, a better camera (as if you’re really going to use an iPad as a camera), and a superior display with an anti-reflective coating, though both models have the same pixel count. Also, the Pro is slightly thinner and lighter (0.96 pounds vs. 1.03 pounds).

A very underwhelming announcement. The main takeaway is that Apple is making its entry-level iPad and iPhone less expensive than before while not changing anything about the more premium products.

* * *

Although the new iPad has a more advanced chip (A9 vs. A8X) than the iPad Air 2 it replaces, it’s also thicker and heavier and loses the anti-reflective coating, so it’s actually a cheaper but less deluxe, inferior product.

Written by Lion of the Blogosphere

March 21, 2017 at 10:47 am

Posted in Technology

iOS v. Android

with 53 comments

I don’t think I said anything about this in the previous smartphone posts, but I definitely prefer iOS, with the caveat that the last Android phone I used was a Samsung Galaxy S3. I found Android just a lot more confusing to use than iOS, and on top of that, every Android manufacturer puts its own interface layer on top of stock Android, making Android even more confusing. I recently had the displeasure of fiddling with a new LG Android phone and I didn’t find it particularly intuitive to use.

As far as which environment you should learn to program for, definitely iOS, because Android apps have to be written in Java, which is the worst possible language to use. Meanwhile, Apple has kept improving Xcode. A few years ago they came out with a new programming language, Swift, which is said to be a lot easier to use than Objective-C, the old programming language for iOS and OS X.
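
Just to give a flavor of it, here’s a toy Swift snippet of my own (nothing from Apple, and the names and prices are just made up for illustration):

// Swift infers types and uses closures, which cuts out most of the
// boilerplate you'd need in Objective-C to do the same thing.
struct Phone {
    let model: String
    let price: Int
}

let phones = [Phone(model: "iPhone SE", price: 399),
              Phone(model: "iPhone 7", price: 649)]

// Filter and map with trailing closures, one line for the whole thing.
let affordable = phones.filter { $0.price < 500 }.map { $0.model }
print(affordable) // prints ["iPhone SE"]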

That’s why applications first come out for iPhone, because the creative white guys program for iOS and then they hire a bunch of Indian H-1B types to do the Java/Android conversion, or maybe just outsource the whole thing to India. That’s my impression anyway; I could be wrong about that.

But it’s a fact that iPhone apps are more profitable because iPhone owners have more disposable income so they buy more stuff. When I had access to the web analytics of the company I used to work for, I could verify that two thirds of our mobile sales came from iOS and only one third from Android. The result is that you get better apps for iOS because companies put more effort into the more profitable platform.

* * *

It’s kind of surprising that Trump uses an Android instead of an iPhone.

* * *

There was once a time when Apple products really rubbed me the wrong way. It’s something about the obnoxious glowing Apple logo combined with the smugness of people who use them.

I have now come around to appreciating the quality of MacBooks, but believe it or not, I still prefer Windows over OS X. OS X has a lot of weird annoyances, like mouse movement that just doesn’t work the right way, with no way to adjust it, and the OS X interface often seems sluggish compared to Windows even though my MacBook Air has a Core i5 in it, which is a pretty powerful chip. Plus you can’t really use a Mac to play games, at least not games with 3D graphics, because there’s no affordable Mac where you can add a graphics card. You have to spend at least $1800 on a Mac to get a discrete (but non-upgradeable) graphics processor instead of the GPU built into the Intel CPU. You can add a $110 Nvidia GTX 1050 to any cheap PC that has a PCIe x16 expansion slot and it will outperform a $2000 iMac. Consequently, many game companies don’t even bother to port their games to Mac, because there are so few Macs out there capable of running modern games with 3D graphics.

Good news for World of Warcraft fans, though: it has modest graphics requirements and will run on all of the current-model Macs! Nevertheless, you have a whole class of applications, games, that are available and run well on Windows but not on Mac, and there aren’t any programs I know of that you need a Mac for, with the exception of developing iOS and OS X applications in Xcode. Mac is often associated with creative software, but Photoshop runs just fine on Windows, as do all the other Adobe programs.

Of course, some might see the inability to play addictive time-wasting games as a benefit of Mac.

* * *

I’d say that the key reason why the Mac became so popular during the last 10 years is that Apple came out with MacBook laptops that just blew away all Windows laptops in both perceived and actual quality. A Dell laptop feels like a cheap yet massively heavy hunk of plastic compared to the sleek, lightweight, all-metal MacBook. And Apple was way ahead of all of the Windows laptop manufacturers in adding the latest technologies, like high-resolution displays and SSDs, and in getting the longest battery life.

Mac desktops, on the other hand, have always been and remain extremely overpriced and non-upgradeable compared to Windows computers. And since you just keep a desktop in some out-of-the-way place on your desk, or even beneath your desk, it isn’t a useful status symbol the way a laptop is. A big heavy tower of cheap plastic is perfectly acceptable for a desktop computer.

Written by Lion of the Blogosphere

March 7, 2017 at 6:06 pm

Posted in Technology

The value of smartphones

with 24 comments

Jjbees writes:

Please write more about smartphones and how everyone is addicted to them and it destroys Civic culture and acts as a security blanket for adult discomfort with being in public.

I’m sorry to disappoint you, but I’ve come to see the value of smartphones. So much time previously wasted doing nothing can now be spent reading the news, or even my blog comments.

I don’t recall that, before smartphones, being in public was any more civic than it is today. Ghetto youth using smartphones are far preferable to ghetto youth who are bored, so I suspect that smartphones are one of the reasons why violent crime has decreased (although smartphones have led to a new class of nonviolent crime, smartphone theft).

I’d like to learn how to program iOS apps, but I think I’ve gotten too old to learn new programming languages. Every time I try to learn Xcode I just find it so boring, unlike when I was younger and I liked that stuff.

Written by Lion of the Blogosphere

March 7, 2017 at 4:50 pm

Posted in Technology

Moore’s Law thoughts

with 77 comments

According to Wikipedia, Moore’s Law, which is an observation and not a natural law of physics, states that the number of transistors on a chip doubles roughly every two years (often quoted as every 18 months). So when engineers from Intel, for example, say that Moore’s Law is still going strong, they mean that the number of transistors on Intel’s most powerful chips continues to increase at that rate.

Throughout the second half of the 20th century, the consumer experience tended to parallel the increasing number of transistors on a chip. That is, chips with twice as many transistors led to personal computers with twice as much apparent computing power (in other words, they ran software twice as fast) for the same price as, or even less than, computers 18 months older.

The year 2004 was a key year in the slowing of personal computer performance gains. That’s the year when clock speeds reached a plateau at which they have been stuck ever since. Up until 2004, smaller transistors allowed for faster clock speeds, which directly correlate with how many instructions the microprocessor can execute per second. But in 2004, clock speeds reached a maximum beyond which the chips would overheat.

Modern computer chips have multiple cores. Two cores are now standard even in low-end computers, and more powerful personal computers have quad-core chips. However, only software specifically written for it can run on multiple cores at the same time. The vast majority of tasks you do on your computer [commenters pointed out various examples of programs that can parallel-process] only use one core at a time, so there isn’t much practical benefit to putting more than four cores in a personal computer, and you can’t make PCs run most current software faster in the future just by making 8-core or 16-core chips.
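
To make that concrete, here’s a little sketch of my own in Swift (using Apple’s Grand Central Dispatch, nothing Intel-specific) showing how the programmer has to split the work up explicitly before extra cores do anything at all:

import Foundation

// Summing a big array the ordinary way runs on ONE core,
// no matter how many cores the chip has.
let numbers = Array(1...10_000_000)
let serialSum = numbers.reduce(0, +)

// To use more cores, the work has to be divided up by hand.
// Here the array is cut into 4 chunks and each chunk is summed in parallel.
let chunkCount = 4
let chunkSize = numbers.count / chunkCount
var partialSums = [Int](repeating: 0, count: chunkCount)
partialSums.withUnsafeMutableBufferPointer { buffer in
    let out = buffer.baseAddress!
    DispatchQueue.concurrentPerform(iterations: chunkCount) { i in
        let start = i * chunkSize
        let end = (i == chunkCount - 1) ? numbers.count : start + chunkSize
        out[i] = numbers[start..<end].reduce(0, +)
    }
}
print(serialSum == partialSums.reduce(0, +)) // true, but only the second version used 4 cores

Most everyday software never bothers with the second version, which is why the extra cores mostly sit there doing nothing.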

It also seems to me that the rate of price decreases has significantly slowed in the recent past. No longer is the most current chip the same price as, or even less expensive than, an 18-month-old chip with half as many transistors.

The end result is that between 2004 and 2012, the yearly increase in apparent computing power of personal computers slowed down a lot. It’s my observation that around 2012, the rate slowed down even more. It seems to me that it has been increasing at less than 10% per year since 2012, whereas during the 1980s and 1990s we got used to something like a 60% performance increase per year. (When you compound 60% per year, you get more than a 10,000-times performance increase over a 20-year period, which is what happened between 1980 and 2000.)
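
For the curious, here’s my own back-of-the-envelope check of that compounding (again in Swift):

import Foundation

// 60% per year compounded over the 20 years from 1980 to 2000:
let boomEra = pow(1.6, 20.0)     // ≈ 12,089x — indeed more than a 10,000-times increase
// Versus roughly 10% per year over the five years since 2012:
let recentEra = pow(1.1, 5.0)    // ≈ 1.61x
print(boomEra, recentEra)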

That explains why I’m still using a six-year-old computer. That explains why computer sales are in the doldrums: there’s no longer a huge benefit to upgrading.

* * *

It’s my advice that if you have a computer with a Core i3 or better chip, you will get a much bigger apparent performance boost from upgrading your HDD to an SSD than from buying a new computer with an HDD.

* * *

The apparent performance of iPads increased ten times between 2012 and 2016 based on my tests, which works out to a 78% increase per year. However, that’s going to come to an end, because iPad performance is always going to lag behind desktop computers (given the iPad’s power and size constraints), and the current iPad Pro 9.7 is already only 29% slower than a similarly priced desktop computer.
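
The 78% figure just comes from working the compounding backwards (my quick check in Swift; I’m assuming the 29% comparison is against the 28K Octane desktop from the iPad Pro review further down):

import Foundation

// A 10x increase spread over the 4 years from 2012 to 2016:
let annualRate = pow(10.0, 1.0 / 4.0) - 1.0   // ≈ 0.78, i.e. about 78% per year
// iPad Pro 9.7 at roughly 20K Octane vs. a similarly priced desktop at roughly 28K:
let slower = 1.0 - 20_000.0 / 28_000.0        // ≈ 0.29, i.e. about 29% slower
print(annualRate, slower)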

Apple has made a lot of money because consumers felt the need to upgrade their iPhones every two years, but I predict that’s coming to an end because their iOS devices have caught up to desktop computers and will only become 10% or so more powerful per year going forward. (Maybe Apple’s secret weapon is that you can’t replace the lithium-ion batteries, which have a limited lifetime before they no longer hold a charge.)

Written by Lion of the Blogosphere

March 2, 2017 at 1:59 pm

Posted in Technology

New York Magazine article on video games very similar to my 10-year-old blog post

with 22 comments

On October 20, 2006, I wrote the following in a blog post:

World of Warcraft falls into a special category of time wasting activity, because it is like masturbation. Men who are unable to get sex from women often find masturbation and pornography to be better substitutes than nothing at all. World of Warcraft provides a similar fix for men who are unable to get status in the real world.

In the real world, men start out with the dream that they will advance and increase their status. But then they discover that it’s not so easy to increase one’s status in the real world. For example, one can devote three years of life to attending law school only to discover that law school was a complete waste of time. In the real world, career tracks usually determine if your status will increase, and the fast track to success only holds a few people. Most people toil away at jobs where they never see any direct benefit from their hard work.

This is where World of Warcraft comes in and meets people’s unmet psychological needs. In WoW and similar games, your status increases slowly but surely every time you play. After so many hours in the game, you can see exactly how many more experience points you have, maybe your level has increased, maybe you have better armor or weapons than you had before. Unlike the real world, where you can work 40 hours of overtime and not even get paid for it, if you put an extra 40 hours into WoW you will definitely have something to show for it. Your status within the virtual world of WoW will have increased in ways you can clearly ascertain.

Have you ever woken up from a dream that was so much more pleasant than real life that you wish you could fall back to sleep and return to the dream? Unfortunately (or maybe fortunately) this never works, and you start your day off with a touch of sadness that the wonders of the dream can’t be realized. For some, World of Warcraft is like a dream they don’t have to wake up from, a world better than the real world because their efforts are actually rewarded with increased status.

I have no doubt that Frank Guan stole my original ideas in his New York Magazine article Video Games Are Better than Real Life. Read the article and see if you agree.

Written by Lion of the Blogosphere

February 27, 2017 at 9:35 am

Posted in Economics, Technology

iPad as ebook reader

With my older iPad 4th gen, it was a tossup for indoor reading whether I preferred the iPad or a dedicated Kobo ebook reader that uses an e-paper display with a built-in light.

I appreciated the larger size of the iPad, what with me getting older and having presbyopia, but it was heavy enough to be fatiguing, and then there was the glossy screen.

With the lighter weight of the iPad Pro and the less reflective screen, for indoor reading the iPad Pro is a clear win over the Kobo reader (unless you’re really opposed to a lit screen, which doesn’t bother me because I’m used to spending hours a day looking at a computer screen, and I suspect all of you are also used to it).

I took the iPad Pro outside around noon on a February day in New York City with hazy sun. The new display was perfectly acceptable in the shade and still readable in direct (hazy February) sunlight, although the Kobo reader with its e-paper display is much preferable in direct sunlight.

Although in the previous iPad Pro post I pointed out the iPad Pro runs about 10 times faster than the older iPad, the older iPad is more than fast enough to read books and in fact is much zippier than the Kobo reader.

* * *

A few years ago, a co-worker had his iPad stolen by black youth in a grab and run (he didn’t say the perp’s race but I’m reading between the lines given the demographics of the location where the crime happened) while waiting for public transportation. So there’s a benefit to using a cheaper e-paper reader on public transportation, one that doesn’t have all your passwords and other secret stuff embedded in it.

Written by Lion of the Blogosphere

February 21, 2017 at 12:06 pm

Posted in Books, Technology

iPad Pro 9.7 review

This is a big upgrade from the iPad 4th Generation. It has twice as much RAM, and the base model has twice as much storage, 32GB vs 16GB.

I ran the Google Octane benchmark (which measures JavaScript performance in a web browser) on a bunch of different devices and got the following approximate results:

2016 Core i3: 28K+
iPad Pro 9.7: 20K+
6-year-old Core i7: 20K+
iPhone 6: 8K+
iPad Air first generation: 6K+
iPad 4th gen: approx 2K (crashes before completing the test)
iPod Touch 5th gen (that’s the previous generation): approx 1K (crashes before completing the test)

So you see there has been a huge increase in the computing power of iOS devices over the last several years. And it’s pretty impressive that the iPad Pro is as powerful as a desktop computer that was state-of-the-art when I bought it six years ago.

How fast does your device need to be for regular use like just surfing the web? Well, it seems that web pages are increasingly loaded down with massive amounts of JavaScript that require more computing power to run at acceptable speed. I think that the iPad Air first gen, with its 6K+ score, is the minimum power for a decent web-surfing experience. And as noted, complex stuff crashes on devices older than that.

Similarly, anything older than the first-generation iPad Air is probably going to lack the horsepower to run an increasing number of new apps. One app that doesn’t run on my old iPad but does run on the new iPad Pro 9.7 is the Purify ad blocker for Safari. Goodbye to ads! On the other hand, Vainglory, which is a pretty complex and graphically detailed real-time game, still runs acceptably on the old 4th gen iPad.

The weight difference is also a big deal. The iPad Pro 9.7 weighs the same as the iPad Air, but is 7 ounces lighter than the older model iPads. I notice it right away. It’s so much less tiring to use the iPad Pro compared to the older iPad.

The screen is improved. It’s still shiny, but less reflective. It’s less distracting when watching a movie because you see less of a reflection of your environment on the screen. But it’s still not 100% non-reflective.

There are four speakers instead of two, and sound is much improved, although for any application where you really want to appreciate the sound, you still need to use EarPods (or other headphones). Even with the improved sound, I still wouldn’t really want to watch a movie with just the internal speakers.

Conclusion:

Worth the upgrade if you have an old iPad, but probably not worth upgrading from an iPad Air, unless you need to use the new Apple Pencil, which I haven’t purchased, so I can’t say anything about it.

Written by Lion of the Blogosphere

February 20, 2017 at 2:44 pm

Posted in Technology

Atari: Game Over (2014)

This is a documentary, available on Netflix streaming, about Atari.

The best parts of the documentary are about the early days of Atari, how it operated, what the engineers who worked there were like.

(1) All of the game developers shown are nerdy young white men. Not a single Asian anywhere. Most of the men were skinny and pale-skinned, and many had poorly trimmed beards. Classic nerds.

(2) The management structure was very bottom up. The engineers themselves had autonomy to design the games and program them. A single programmer would turn out a game in a few months.

This is totally unlike modern IT, which is very top-down, with product managers and designers telling the code monkeys what they’re supposed to program, in some high-level language like Java or C#. Not like those early programmers at Atari, who were writing in assembly language, or even writing raw machine code without an assembler. No object-oriented crap, no unit tests, just pure hacking. (That’s a technical detail I’m filling in for you that wasn’t mentioned in the documentary.)

I think this demonstrates the amazing creativity and productivity that are unleashed when you get a bunch of smart nerdy white guys together and give them the autonomy to use their talents.

And they enjoyed themselves. The game developer featured in the documentary said no job after that ever compared to the fun and intensity of working at Atari. He recently became a psychotherapist, which he says is the first job he has liked since Atari.

Unfortunately, the majority of the documentary is not focused on the programmers at Atari, but rather on a current-day quest to dig up a garbage dump where millions of old Atari cartridges were believed to be buried. Who cares about that?

I suspect that the person or people who produced this documentary played videogames themselves but otherwise have no knowledge of either business or software development. They never explain exactly why Atari went out of business. I assume it’s because management massively expanded the business and hired thousands of unnecessary employees with correspondingly expensive office space, based on sales projections which turned out to be vastly overoptimistic. Or maybe they invested huge amounts of money in new projects that failed. But that’s just speculation. I could be wrong.

Written by Lion of the Blogosphere

February 19, 2017 at 2:32 pm

Posted in Movies, Technology

Men who play video games instead of working

I’ve had this Washington Post article bookmarked for months, but never wrote a blog post about it.

“When I play a game, I know if I have a few hours I will be rewarded,” he said. “With a job, it’s always been up in the air with the amount of work I put in and the reward.”

That sounds exactly like something I wrote about World of Warcraft in 2006 (more than 10 years ago!):

… Most people toil away at jobs where they never see any direct benefit from their hard work.

This is where World of Warcraft comes in and meets people’s unmet psychological needs. In WoW and similar games, your status increases slowly but surely every time you play. After so many hours in the game, you can see exactly how many more experience points you have, maybe your level has increased, maybe you have better armor or weapons than you had before. Unlike the real world, where you can work 40 hours of overtime and not even get paid for it, if you put an extra 40 hours into WoW you will definitely have something to show for it. Your status within the virtual world of WoW will have increased in ways you can clearly ascertain.

The question is, are video games the cause of men retreating from the conventional workforce, or a symptom?

I do think that video games, as well as other high-tech diversions like the internet, social media, and high-definition TV, make being out of work more bearable and to some extent demotivate people from wanting to get back into the labor force (which, for people without self-actualizing jobs, is often unpleasant).

Written by Lion of the Blogosphere

February 15, 2017 at 12:56 pm

Finally figured out how to read LA Times without ads

[Screenshot: latimes-javascript]

The LA Times has updated its anti-adblock popup to defeat even the Brave Browser.

However, I finally found a simple workaround. Go into your Google Chrome settings and block JavaScript from running on the LA Times site. With JavaScript disabled, the anti-adblock script just stops working!
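
If you want the exact clicks (this is from the version of Chrome I’m using, and Google likes to move these menus around, so it may look slightly different for you):

1. Type chrome://settings/content/javascript into the address bar (or dig through Settings → Content settings → JavaScript).
2. Under the block list, click Add and enter [*.]latimes.com.
3. Reload the LA Times and the anti-adblock popup never runs.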

(Without JavaScript, the site is missing some content that is post-loaded via AJAX calls, but it’s still better than having ads, and it also makes the site run a lot faster.)

Written by Lion of the Blogosphere

January 20, 2017 at 5:28 pm

Posted in Technology
