mobile photography technology, culture and community

Google Camera app now on KitKat and adds Lytro-like effect

Lens Blur simulates the subject isolation you usually get from a DSLR and a fast lens. 
In the unmodified image both subject and background are in focus.

So far Google's in-house camera app has only been available on Nexus devices, but a new version of the app is now available to anyone in the Google Play Store. For now it only works on devices running Android KitKat (4.4), but Google is planning to make the app work on older versions too.

The new app comes with the minimalist design we know from the vanilla Android version and offers the Photo Sphere 360-degree panorama feature, an improved standard panorama function, and a brand new feature called Lens Blur. The latter uses the device's processing power and some clever software trickery to simulate the shallow depth of field you get from a large-sensor and fast-lens combination, allowing you to isolate your subject from the background.

To achieve this, the camera takes a series of images while you "sweep" it slightly upwards. It then uses the captured information to create a 3D model, similar to what the secondary camera module does in HTC's new One M8 flagship phone.
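In principle, the sweep works like a tiny stereo rig: a nearby subject shifts more between the first and last frame than the distant background, and that parallax can be triangulated into depth. The sketch below illustrates the geometry; the focal length, pixel pitch, and sweep baseline are illustrative assumptions, not figures from the article or from Google's implementation.

```python
# Sketch: recovering depth from the parallax created by the upward "sweep".
# All constants are assumptions chosen for illustration.

PIXEL_PITCH_MM = 0.0031   # assumed sensor pixel size (3.1 micron)
FOCAL_MM = 5.2            # assumed lens focal length
BASELINE_MM = 20.0        # assumed camera travel during the sweep

def depth_from_disparity(disparity_px: float) -> float:
    """Triangulate depth (mm) from the pixel shift between two sweep frames."""
    disparity_mm = disparity_px * PIXEL_PITCH_MM
    return FOCAL_MM * BASELINE_MM / disparity_mm

# A close subject produces a large shift, the background a small one:
near = depth_from_disparity(67.0)   # ~0.5 m away
far = depth_from_disparity(6.7)     # ~5 m away
```

Repeating this per pixel across the frames of the sweep yields the depth map that the blur is later driven by.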

The Google Camera app features a minimalist design.
It comes with a number of specialist shooting modes, including a new Lens Blur option.

Lens Blur also allows you to change the point of focus after the photo is taken, like a Lytro Light Field camera. Via sliders, you can also simulate different apertures. The Google Research Blog gives a good insight into how the system works.

Google's samples look good, and better than what we've seen from the HTC One M8, but we'll have to do some testing ourselves to find out how well the system works. If you want to try Lens Blur and the other new features yourself, and you own a device running Android 4.4, download the app for free from the Google Play Store.

Source: Google | Via: Petapixel



Total comments: 30

I have tested this on a Nexus 5, and it works very well. Hey Apple, can you do this?


Google does some great things, but their app lifecycle is horrible.

1. Release beta, partially functioning code.
2. Fix the bugs, add features, get it all working.
3. Replace it with a 'streamlined' version with 90% of the features removed.
4. Repeat.

We got a blur mode, and lost all the manual control and pretty much everything else. Sigh.


The whole app is SO much better. It is usable now.


You should do this with the app AfterFocus. Instead of letting the phone do it quickly and getting it wrong, it's better to create the masks yourself. The selection tool is top notch.


Certainly not.


Theoretically you can now view Google Android KitKat-generated photos on a 32-bit Windows computer with a 4.4.2 KitKat OS install. Interesting?

Have those hacker privacy issues found in India with 4.3 and 4.4 Android devices been resolved yet?


Why should that be interesting?


That looks pretty damn good!


I've continued testing: here's my latest album of shots both with blurring and without (using exactly the same camera position), shooting subjects from low (about 40 cm), middle (about 1 m) and high (about 2 m) distance. The full set is at

As you can see,

- in the mid-distance shots (a 17” MBP with its surroundings shot from the front, displaying this very Web page), there is some very serious artifacting in both blurred shots:

On the first image, the upper right corner of the screen bezel of the MBP is awful. So is the upper bezel of the on-screen Nexus phone on the right, right over the “Google Search on Android adds voice commands for camera” title.



(continued from above)

On the second one, it's the on-screen Nexus phone on the left that has a completely messed-up upper bezel. In this shot, the center part of the left bezel of the MBP is awful.

- the low-distance shots are significantly better.

- the high-distance ones are passable.

All in all, based on my experiments, you'll want to use the new blurring feature with subjects as close as possible – preferably under half a meter.

Ignat Solovey

...which almost completely kills the need for this feature, at least on modern high-end smartphones (I own a Sony Z1 Compact with a 1/2.3" sensor, and I did not update it to 4.4.2, being fine with rooted 4.3.1). You can't cheat basic physics, and you'll get some sort of background blur anyway: the 5.2mm f/2 lens used in the camera module in question, when focused on a subject 50 cm away, has a near DOF limit at 31.5 cm and a far limit at 120.6 cm. So when I take a picture of something this close, I get objects two and more meters away quite heavily blurred, although still distinguishable. The hyperfocal distance for this lens is 85 centimeters, and noticeable software processing artifacts are almost inevitable for now, at least with high-contrast subjects like the selfie in the article. Also, most people taking selfies in front of something like a landmark expect the background to be completely in focus, and there are more of them than "big cam effect" lovers.
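The commenter's figures are self-consistent under the standard thin-lens hyperfocal and depth-of-field formulas. The circle of confusion below (~0.016 mm) is simply whatever value reproduces the quoted 85 cm hyperfocal distance; it is an assumption for checking the arithmetic, not a published spec for this camera module.

```python
# Checking the quoted DOF figures (5.2mm f/2, focus at 50 cm) against the
# standard formulas. COC_MM is an assumed value chosen to match the quoted
# 85 cm hyperfocal distance.

F_MM = 5.2      # focal length
N = 2.0         # f-number
COC_MM = 0.016  # assumed circle of confusion

def hyperfocal() -> float:
    """Hyperfocal distance in mm: H = f + f^2 / (N * c)."""
    return F_MM + F_MM**2 / (N * COC_MM)

def dof_limits(subject_mm: float) -> tuple[float, float]:
    """Near and far limits of acceptable sharpness for a subject distance."""
    h = hyperfocal()
    near = subject_mm * (h - F_MM) / (h + subject_mm - 2 * F_MM)
    far = subject_mm * (h - F_MM) / (h - subject_mm)
    return near, far

near, far = dof_limits(500)  # subject 50 cm away
# near ≈ 315 mm and far ≈ 1206 mm, matching the 31.5 cm / 120.6 cm above
```

Note that the far-limit formula only applies while the subject is closer than the hyperfocal distance; beyond it, everything to infinity is acceptably sharp.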


Many won't like it, as it's "digital magic" compared to a good old big sensor and expensive lens. But in a few years we will be able to make clean low-light 4K videos in 3D with a shallow depth-of-field effect at night at ISO 25,000 on our $500 smartphones, and the results might look better than those from today's $100,000 cinema equipment.

That's technology, cheers on the engineers who make such things possible and affordable!


Which makes you wonder: what will the $100k cameras look like then?


Lens Blur seems to produce a 2.5MP image (or thereabouts). How does that compare to the HTC/Samsung implementation? I only tested it briefly, in low light no less, which produced some clearly visible edge artefacts...

Frankly I'm not all that interested in what changed, got added or removed; as a phone camera it's still serviceable for what I use it for. BUT you missed by far the biggest improvement... The preview FINALLY displays the correct aspect ratio instead of a cropped 16:9 view!

That was my biggest issue with the old UI, never mind the circular menu; the preview made accurate framing impossible. They also added grid lines, FWIW. Lots of people are still asking for 3:2, 16:9, and even 1:1 output crop modes though.

Android Central has a really good article covering all the changes in detail. The settings button is really easy to miss on the opposite corner of the modes menu btw.


The HTC implementation in their new dual-sensor handsets generally produces pretty bad results.


Yeah, I think it was a waste for them to sink so many resources into it and go as far as adding a second camera, not to mention dropping OIS... Hopefully they'll give up on dual sensors next time, again.


Well... I searched a bit, and it seems that the results are as disappointing as the ones from HTC's "dual camera" (even if it could be slightly better with the Google app).

Here is a link to a sample I've found:
The upper part of the rear wheel of the bike shows some terrible results, as do the contours in general, and there are two "focus" spots in the plain background...


" the result are as disappointing as the one from htc "dual camera""

Not all of them. The bicycle shot was indeed absolutely bad because of the dual focal planes (the background and the spoke). However, I've only had satisfactory or even good results with less "tricky" scenes.


I've also seen "satisfactory" results... but even those are generally only OK resized to something like 300x200 pixels.
But as a (so much advertised) feature it is not really useful. I didn't expect it to compete with SLRs or anything, but it is just not working yet. At all.


BTW, a quick correction: "It then uses the captured information to create a 3-model" - you meant 3D, not 3.

Lars Rehm



I've very thoroughly tested it on my (factory; no rooting) Nexus 7 2013 and found out the following:

- the new panorama support is GREAT, particularly if you enable maximum resolution (the default is high-res). Up until now, Google's implementation was a joke, far, far inferior to either Apple's on the iPhone 4S+ or Samsung's implementation in their Android handsets. Even Nokia's WP (but not Symbian) implementation has been significantly better.

- blurring worked just GREAT in my tests. While some people did complain about it being slow(ish), I haven't noticed speed problems on my N7, which, while "only" having a 5 Mpixel sensor, has a significantly slower CPU than the SD800.

What's wrong? Most importantly, all manual modes have been removed, which is a BIG-BIG minus. There's no

- scene selection
- manual WB and ISO setting
- timer

The first two are particularly painful, as with the new Camera app there is no way to force the system to shoot at high shutter speeds.


Only manual exposure compensation has remained. (Which means that, in this regard, Android is still superior to iOS, where there's not even proper exposure compensation. See my writeup on the implications of this at if interested.)

Fortunately, some people at Android Police discovered that Google may not have ditched these manual settings entirely and may add them back some time in the future. (More info: )

Lars Rehm

Thanks for the summary. I have only briefly tested the blur feature so far, and it seems to work quite well. Hopefully I'll get some more testing time over the weekend ;)


(EDIT: ignore; will re-upload the images soon)


It's clever coding, no doubt about it. I suppose it isolates the subject, determining the edges/borders, and then applies some kind of selective Gaussian blur to what the code determines constitutes the background. I eagerly downloaded it and took some selfies; when viewed on my phone, I thought 'Google, I hate you! My beloved Zeiss Sonnar 1.5 is redundant!', which soon changed to 'Ah, no chance' after viewing at full resolution. It does a very good job, but the fact that you have to ** move ** your camphone as you actually take the picture is a recipe for blurring where you don't want it (i.e. the subject). For web posting and small viewing, it works very well indeed, but it has a long way to go before the 1.2L Canons have anything to worry about. I'm sure they'll get close sometime soon though. Nicely done, Google coders! Ah, now back to my precious Sonnar 1.5 :)


It builds a depth map from multiple images, and applies blur to each pixel in proportion to its longitudinal distance from the focal point. You can dynamically place the focal point anywhere you like in the photo.
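That per-pixel idea can be sketched in a few lines: each pixel is blurred more the further its depth lies from a user-chosen focal plane. This is an illustrative toy, not Google's actual pipeline. A real renderer would vary the blur kernel per pixel; this sketch simply blends between the sharp image and one uniformly blurred copy, and all names are made up for the example.

```python
import numpy as np

def box_blur(img: np.ndarray, radius: int) -> np.ndarray:
    """Cheap separable box blur (a stand-in for a proper lens kernel)."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def refocus(img: np.ndarray, depth: np.ndarray, focal_depth: float,
            strength: float = 1.0) -> np.ndarray:
    """Blend sharp and blurred copies; weight grows with |depth - focal_depth|."""
    blurred = box_blur(img, radius=5)
    weight = np.clip(strength * np.abs(depth - focal_depth), 0.0, 1.0)
    return (1 - weight) * img + weight * blurred
```

Moving `focal_depth` "re-focuses" the shot after capture, and raising `strength` mimics opening up the simulated aperture, which matches the sliders the article describes.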


Can the Google Camera software be used to process photos taken by another camera? I want to upload photos to Google Plus for processing by the Google Camera software for the blurring effect.


Clark666, nope. Some iOS HDR / pano stitching apps accept files imported directly from the Camera Roll. Unfortunately, Google Camera's new version doesn't accept anything. I hope such import capabilities are added some time.


Why would they, or how even? Doesn't the depth map rely on data gathered during the unique photo-taking process, where you swivel the phone over the subject? Without that, the blur effect would be even worse.
