TechByter Worldwide

Program Date: 11 Aug 2013

Clarifying Blurry Pictures with Photoshop CC

Two of the new features in the latest version of Photoshop are designed to improve specific kinds of images. The Camera Shake Reduction filter can improve an image that is smeared because the camera moved during the exposure and a Preserve Details option might save the day if you have to do something graphics professionals say you should never do—increase the size of an image. Let's take a look at these two features.

A Bad Case of the Shakes

Camera shake is a particularly vexing problem with close-up photography and when you're using a long lens in relatively low light. For long lenses, one of the general rules that photographers keep in mind is not to use a shutter speed slower than the reciprocal of the lens's focal length. That sounds complicated, but it's really simple. Let's say you have a 200mm telephoto lens. The reciprocal of 200 is 1/200th.

So the shutter speed shouldn't be any slower than 1/200th of a second: 1/250th would be fine, but 1/125th would be in the danger zone. At least that's the way things used to work, but digital cameras have simultaneously made things both better and worse.

BETTER: You might have a lens that stabilizes the image. Nikon calls these lenses VR (vibration reduction) and Canon uses the term IS (image stabilization). Most point-and-shoot cameras don't have the feature, but when it's present, the technology might allow you to use a shutter speed one or two stops slower than the rule suggests. In the previous example, 1/125th would still be acceptable and even 1/60th might produce acceptable images.

WORSE: Unless you have a great deal of money, the camera you're using probably doesn't have a full-frame sensor. Sensor designations make little sense because they're based on 1950s conventions that described the size of the imaging tubes used in television cameras. Fortunately, there are only two main sizes in digital SLRs intended for consumers and semi-pros: APS-C and Micro Four Thirds. Actually, there are slight differences between Nikon and Canon on the APS-C standard, but let's not go there.

Because these sensor sizes are smaller than a standard 35mm film frame, they have a multiplier effect on the lens's effective focal length. The Nikon APS-C multiplier is 1.5, Canon's APS-C multiplier is 1.6, and the Micro Four Thirds multiplier is 2.0. That means a 200mm lens is the equivalent of a 300mm lens on a Nikon APS-C camera, 320mm on a Canon APS-C camera, and 400mm on a Micro Four Thirds camera. So now your minimum shutter speeds are somewhere between 1/300th and 1/400th of a second.
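If you'd like to check the arithmetic yourself, here's a minimal Python sketch of the rule. The function name and the stabilization handling are my own illustration, not anything from a camera maker.

    def min_handheld_shutter(focal_length_mm, crop_factor=1.0, stabilized_stops=0):
        """Slowest 'safe' handheld shutter speed, in seconds (reciprocal rule)."""
        effective_mm = focal_length_mm * crop_factor
        speed = 1.0 / effective_mm               # the reciprocal rule
        return speed * (2 ** stabilized_stops)   # each stop of VR/IS doubles it

    for body, crop in [("Full frame", 1.0), ("Nikon APS-C", 1.5),
                       ("Canon APS-C", 1.6), ("Micro Four Thirds", 2.0)]:
        s = min_handheld_shutter(200, crop)
        print("%-17s 200mm acts like %3.0fmm -> 1/%.0f sec or faster"
              % (body, 200 * crop, 1.0 / s))

Run it and you'll see the numbers above: 1/200th for full frame and 1/300th, 1/320th, and 1/400th for the cropped sensors.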

Image blur is a challenge for close-up photographers, too, for different reasons, but the result is the same.

Adobe provides reviewers with images that can be used to test the various functions, and I use those images for testing. When it comes time to explain how a feature works, though, I generally use one of my own images, or at least one taken by someone in the family.

This is a salad that my wife created and then photographed. There's a bit of blur as you'll see when you click the image to see a larger view.

Click any of the smaller images for a full-size view. Press Esc to dismiss the large image.


Here's an enlarged view of just the center of the image. At this magnification, the blur is readily apparent.

One of the reasons that I use my own images for testing is that I can select an image that will be more difficult for the automated process to deal with. In this case, the blur is quite easily seen, but it's also small. That will make it difficult for the process to identify and fix.

So let's see how well it works.

Shake Reduction is applied as a filter in the Sharpen section of the Filter menu, so selecting it is as easy as selecting any other filter.

When you click the image to see a larger view, you'll notice a rectangle that shows the area of the image that Photoshop thinks should be repaired. You can move the center of the rectangle or change its size.

And, as you'll see in a moment, you can add others because the type of blur in one part of an image might not be the same as blur in another area.

Here's the result, and it's quite an improvement. This is never going to be an outstanding image, but now there's more apparent detail.

The lower right area needs some help, too, I think.

Here I've added another Shake Reduction area.

And here's the final result. Be sure to compare it to the first image in the sequence.

You can control the amount of sharpening, too. I've pushed it a bit harder here than necessary to emphasize the differences between the before and after versions.
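Adobe hasn't published the details of how Shake Reduction works, but it is generally described as estimating the blur path and then deconvolving the image. As a simplified illustration only (and assuming NumPy, SciPy, and scikit-image are installed), this Python sketch blurs a test image with a known streak and then recovers detail with Richardson-Lucy deconvolution. Photoshop's much harder job is estimating that blur kernel with no help.

    import numpy as np
    from scipy.signal import convolve2d
    from skimage import color, data, restoration

    image = color.rgb2gray(data.astronaut())   # a built-in test image

    # Simulate camera shake with a 9-pixel horizontal streak.
    psf = np.zeros((9, 9))
    psf[4, :] = 1.0 / 9.0

    blurred = convolve2d(image, psf, mode="same", boundary="symm")

    # Deconvolve with the known kernel; 30 iterations is an arbitrary choice.
    restored = restoration.richardson_lucy(blurred, psf, 30)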

When All You Have is a Small Image

Another common problem is having only a small image that you'd like to print as an enlargement. A photo that's 640x480 pixels will look just fine on the screen, but if you try to print it as a 5x7, the image will have an effective resolution of just over 90 dots per inch, and a photograph needs 200 to 300 dpi for decent quality. For an 8x10, that small image's effective resolution would be about 65 dpi.
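The numbers come from simple division: pixels across divided by inches across. A quick Python check:

    width_px = 640
    for print_width_in in (7, 10):   # the long side of a 5x7 and an 8x10
        print("%d-inch print: about %d dpi"
              % (print_width_in, width_px / print_width_in))
    # 7-inch print: about 91 dpi
    # 10-inch print: about 64 dpi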

Photoshop has had the ability to upsample images for years and can achieve acceptable results, but the new intelligent upsampling improves the process.

Here I've started with one image of (what else?) a cat. Scampi looks just fine at this resolution but I'm going to create a much larger copy that could be printed. I'll do a side-by-side comparison with the old method on the left and the new method on the right.

I'm enlarging the image to 400% of its original size, so the final size will be 2560x1920 pixels. That's enough for a print resolution of about 250 dpi for an 8x10 photo.

I've selected the option that previously would have provided the best possible result: Bicubic Smoother (enlargement).

Then I repeated the process with the image on the right. The settings are identical except that this time I selected Preserve Details (enlargement).

A 400% enlargement is an enormous change. Photoshop will need to fill in many megabytes of detail that simply isn't present in the original. It's the difference between a photo that holds a bit less than 1MB of data and an image that holds a bit more than 14MB, so Photoshop is going to have to make up about 13MB of information that isn't there.

Does "interpolate" sound better than "make up"?

Here's the side-by-side comparison, old way on the left and new way on the right.

Yes, there's a lot of sharpening and noise in the image on the right, but that means that the "made up" parts of the image will appear to have much more detail.
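Photoshop's Preserve Details resampling is proprietary, so I can't show you Adobe's code. As a very rough analogue, though, here's a Python/Pillow sketch that compares plain bicubic enlargement with bicubic enlargement plus an unsharp mask; the file names are made up and the filter settings are arbitrary.

    from PIL import Image, ImageFilter

    src = Image.open("scampi-640x480.jpg")   # hypothetical input file
    big = src.resize((src.width * 4, src.height * 4), Image.BICUBIC)
    big.save("scampi-bicubic.jpg")           # the "old way"

    # Sharpening makes the interpolated pixels look more detailed.
    sharp = big.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
    sharp.save("scampi-sharpened.jpg")       # a crude stand-in for the new way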

This is another case in which the resulting image is never going to be an outstanding photograph, but it might be enough to serve the intended purpose.

The original and both of the enlarged images are below. When you click the 640x480 image, the pop-up will show the full-size image. The other two pop-ups will be less than half of their enlarged sizes, so I'm providing links that will allow you to download the full 2560x1920 versions for review.

The original Scampi cat.

Download the full-size (640x480) image. [About 310KB]

Scampi enlarged with the Bicubic Smoother method. This is the best you could do before the current version of Photoshop.

Download the full-size (2560x1920) bicubic (old style) image. [About 1MB]

Scampi enlarged with the Preserve Details option.

Download the full-size (2560x1920) preserve detail (new style) image. [About 1MB]

These are two of the new features in the Creative Cloud version of Adobe Photoshop. Because the applications are now being updated constantly, I have chosen not to rate individual applications or even to show all of the new features of any one application. Instead, I'll tell you about the features that seem most noteworthy as I identify them.

Details and pricing for Creative Cloud are on Adobe's website.

Desktop Computing's Antediluvian Past

In many ways, Adobe is leading us toward the future of computing: software that is effectively leased or rented with a license that has set start and end dates, and software that can automatically update itself as new features are released. Was anyone thinking about this in 1979? Probably not, but you might be surprised to see just how far ahead some people were thinking.

I've talked about the Xerox Palo Alto Research Center (PARC) previously because it is the place where most of modern computing was invented: the graphical user interface, the mouse, many of the protocols that run the Internet, object-oriented programming, the laser printer, desktop computers, and even notebook computers all trace their heritage back to PARC. Xerox was never able to wring a profit out of PARC, and had the operation been located closer to the company's headquarters in Connecticut, it probably would have been shut down.

Fortunately, it wasn't and the engineers there gave us the earliest versions of the tools we now take for granted.

The Alto could be considered the first desktop computer and I've mentioned it before. The screen sat on the desk, along with a keyboard and mouse, but the main part of the system actually resided under the desk. Thanks to listener Wolfgang Gunther, I have had a copy of the Alto User's Handbook since January and looking through it has provided some fascinating insights.

There are stories of PARC engineers looking at pizza boxes and wondering why a computer couldn't be constructed that would be about the size and shape of a pizza box. It could fold open, they thought, with a screen on one side and the keyboard on the other. Of course, that was long before flat-screens were invented and when even floppy-disk drives were the size of shoe boxes. (Engineers and pizza, who would have thought!)

This manual looks nothing like today's software manuals. For one thing, most current software and hardware don't come with printed manuals at all, and only a few even have PDF manuals. Back then, the manual had to explain everything about how the system could (and should) be used. Oh, and there were no pictures or screen shots.

A bit of sexism on page 1: As forward-thinking as the engineers were, they seemed to assume that the operator of the Alto would be a man and that the man would have no clue how to obtain replacement parts. For that, ask your secretary.

SWAT was the Alto's equivalent of Ctrl-Alt-Del (or Command-Option-Escape on a Mac).

Yes, file servers existed in 1979, at least at PARC. They were used for storing files and for printing. That meant network technology was available, too. Keep in mind that this was just 9 years after the Advanced Research Projects Agency Network (ARPANET) had been expanded with connections from coast to coast.

Printing was complicated in those days. It's a bit less complicated today, but printers still have the capacity to befuddle even talented programmers. Perhaps that should be especially talented programmers.

Yes, typefaces were available back then, well before John Warnock and Charles Geschke founded Adobe. But Warnock and Geschke were working at PARC in the Alto days.

New software releases? Well, you didn't just hop on the Internet and download them, because the Internet as we know it didn't yet exist. The file-transfer protocol (FTP) did exist, though, and people at the few locations connected via ARPANET could obtain files that way.

No, Steve Jobs did not invent the mouse. That was Douglas Engelbart at the Stanford Research Institute, although members of his team later brought the idea to PARC, and a section of the Alto manual explained how to use it.

It was important in those days to save the document frequently, clear the computer's working memory, and then reload it. Forget to do that and you might run out of memory.

Memory (both RAM and magnetic) was incredibly expensive in 1979, so the largest document that an Alto could edit was about 65,000 characters. A typewritten page (Pica, fixed width) contains about 1250 characters, so the Alto would max out at about 50 pages.

No, Steve Jobs did not invent the windowed computing environment. The Alto made it possible to work on several documents simultaneously in 1979. Steve Jobs was 24. (If you're keeping track, I was 22.)

No, Bill Gates didn't invent the Blue Screen of Death (although the Alto's wasn't blue). When something went wrong, the Alto could display a completely baffling message about what had happened.

And if you think that macros, command-line switches, and other advanced computing capabilities were invented much later, check this out.

If you'd like to see the entire manual, you can download it from the TechByter Worldwide website. It's a large file (more than 7MB) because it's a scanned copy of the original manual. They couldn't make a PDF back then because Warnock and Geschke were still working at PARC and hadn't yet invented PDFs.

But by 1979, PARC had the forerunners for much of the hardware and some of the software that are indispensable today.

Short Circuits

Chrome Can Save Passwords; You Might Not Want It To

It's not uncommon for people to have browsers store passwords. Internet Explorer can do it. Firefox can do it. Chrome can do it. Most of these are somewhat less than secure, but people continue to use them even at a time when services such as LastPass are available. You could be giving your passwords away.

At the outset, I need to make one thing clear: This is what programmers call an "edge condition", something that can happen, but isn't particularly likely to happen. If you use Chrome on a single computer and your account isn't accessible to others who might use the computer, it's really not a significant security threat.

But if you use Chrome on multiple computers and you use Chrome's synchronizing feature to update passwords on all the computers, your passwords might be at risk. This isn't something that Google plans to fix, either. It's a known problem and Chrome's managers feel that it's not a threat worth fixing.

Developer Elliott Kember has provided an explanation of how this works. Someone who's curious about your stored passwords could just open the Settings panel, display the Advanced Settings, and then take a look at Passwords and Forms. It all looks safe enough because the passwords are obscured by asterisks, but click any password and a new button appears. The button is labeled "Show" and, when the curious person clicks it, the password will be displayed in plain text.

For someone to see your stored passwords, they would need access to your computer account, which means either that you don't protect the account with a login password or that you've given the password to someone else. So if you use Chrome and other people can gain access to your account on any of the computers where Chrome is installed, you might want to review your security procedures.

Surface Pro Now Costs $100 Less

You probably saw this one coming. Microsoft recently dropped the price of the Surface RT tablet by $150 but left the price of the more powerful Surface Pro where it was. Now the Pro is getting a $100 cut of its own. The Pro is the tablet that can run standard Windows applications, not just Metro apps.

It's a limited-time offer, though, that ends on August 29th. The price reduction applies to both the 64GB and 128GB versions, so now the tablets cost $800 and $900, respectively. The published prices are, of course, $799 and $899; some marketers still seem to think that a dollar off can fool people into reading $800 as $700 and $900 as $800.

Microsoft says they've seen "great worldwide success" with the lower prices of the Surface RT. There's no question that the Surface tablets have lots of useful features, but they haven't been selling well.

Surface tablets are available in stores, online, and at Microsoft's website.

Amazon CEO Jeff Bezos Joins the News Biz; Amazon Offers Fine Art

There's probably no relationship here at all, but it's been a busy week in Seattle, where Jeff Bezos bought the Washington Post (perhaps thinking it was a newspaper in Washington State instead of Washington, DC*) and then his company, Amazon.com, announced that it will begin selling fine art online.

*Just kidding, Jeff. Actually, I'm looking forward to seeing what innovative ideas you'll bring to the newspaper business. If anybody can figure out a way to make newspapers work in today's digital age, I think you're the one.

More than a decade ago, Amazon tried an experiment with the well-known auction house Sotheby's, but it didn't last very long. This time around, Amazon is working with 150 art galleries and dealers that will sell original and limited-edition artworks on Amazon Art.

Currently the site features about 4,500 artists and shows more than 40,000 works of art that range in price from a few hundred dollars to nearly $5 million. Amazon.com always offers a wealth of information about the products it sells, and that's the case with Amazon Art (http://www.amazon.com/art/), which still carries a "beta" tag. The site provides background information about each work shown, the artist, and the artwork's exhibition history.

The Month of Many Changes at Yahoo

When radio stations plan to change formats, they often play the same piece of music over and over for a weekend or a week. Apparently that's something program managers learn in program manager school. Perhaps the thought is that listeners who stumble across the station will check back from time to time to see if that song is still playing. Maybe they'll even add the station to the presets in their car. Or something like that. Yahoo is trying a variation on that theme.

For the next 30 days, the Yahoo portal's logo will change each day. At the end of the period, a new permanent logo will take its place according to Yahoo's Chief Marketing Officer, Kathy Savitt.

Yahoo has been making a lot of changes recently (making people come in to the office to work, for example), and Savitt says that the new logo will reflect Yahoo's rapid changes and its renewed sense of purpose. Perhaps we should be grateful that Yahoo didn't spend two years creating a mission statement that could be written by an average business administration class in an hour. (Don't laugh; it's been done.)

Savitt says that Yahoo has introduced beautiful new products that have changed the way visitors see the weather, read email, share photos, and follow sports teams. "We’ve partnered with great artists," she said, to take Yahoo "On The Road." And she pointed to the acquisition of Tumblr as another example of Yahoo's new present and its possible future.

"The new logo will be a modern redesign that’s more reflective of our reimagined design and new experiences," Savitt said in typical marketing-speak. And that's why Yahoo will display a different variant of the logo every day for a month. "It’s our way of having some fun while honoring the legacy of our present logo."

If you miss one, Yahoo will allow you to collect the entire set online. Be the first in your neighborhood to have them all! Or if you want to be among the first to see the new logo, set a calendar reminder for midnight (Eastern Time) the morning of Thursday, September 5th.

When Things Go Wrong

The TechByter website was down for most of the day on Friday, August 2. At first, I thought that I was seeing a distributed denial-of-service (DDoS) attack, but the cause was much more mundane.

BlueHost CEO Dan Handy explains: “On August 2, 2013, during routine hardware maintenance which doesn’t typically affect customers, we experienced unexpected network issues. Hardware failures, including our core routers, cascaded into further issues causing downtime and instability. Our response team jumped into action and worked tirelessly alongside hardware vendors to restore connectivity as quickly as possible.”

This particular problem is unlikely ever to occur again at BlueHost. Handy says, “Now that operations are back to normal, our engineers are conducting a thorough review of this incident and are already beginning to implement new safeguards to make our network better.”

The fact that this particular problem is unlikely to recur doesn’t mean that there will never be an outage. Other things can break and I’m sure that some of them will at some time.

During the first 8 to 10 years that I maintained a website (from the beginning of the Web in the mid-1990s), I was forced to switch hosting providers frequently because too many people decided to offer website hosting without having the infrastructure to support it. As a hosting company, BlueHost has provided outstanding service and a broad feature set for many years, and at a modest price.

There are options. For those who are concerned about performance in a shared environment such as the one BlueHost offers, virtual private servers are available at prices ranging from $30 to $120 per month, and BlueHost offers dedicated servers at $150 to $250 per month. This compares with $7 per month for shared hosting. Elsewhere, RackSpace, a well-known hosting company, has plans starting at $150 per month.

My feeling is that a $100-per-year service that provides outstanding functionality and reasonable response time (along with occasional outages) is better than a $3,000-per-year service that provides outstanding functionality, slightly faster response time, and occasional outages. That's particularly true because switching from one hosting company to another is both time consuming (and therefore expensive) and highly disruptive.

Sometimes the Internet is so much fun that I can hardly bear it.