Saturday, December 31, 2011

Canon Super-35mm Cinema Sensor Explained

Canon published a few whitepapers discussing the design considerations behind the Super-35mm-sized CMOS sensor in its recently announced EOS C300 video camera. The first one, "New 35mm CMOS Image Sensor for Digital Cine Motion Imaging", gives the sensor spec:


The color filter array is the classic Bayer. Canon explains the resolution choice:

Illustrating the separate CFA array and the CMOS imager while also
showing the CFA separated into its component color filters to better expose
the structure of their respective sparsely sampled lattices

"The image sensor readout strategy radically departs from the customary “De-Bayer” deployment of quincunx sampling of the green photosites to maximize the green video resolution (and hence the matriced Luma resolution). The design strategy of this new sensor is to not to seek any form of “4K” resolution — but rather to specifically confine the reconstruction of each of the R,G, and B video components to a full digital sampling structure of 1920 (H) x 1080 (V) — according to the SMPTE 274M HDTV Production Standard."

Showing the concept of structuring the final Green video component within the
pre-processing LSI from the two dual video readouts from the CMOS image sensor

"The dual Green process offers the following significant technical advantages:

  1. Doubles the effective saturation level of the summed Green output video
  2. Increases the noise of the final Green output only by a factor of square root of two
  3. Combination of 1) and 2) increases the effective dynamic range of the green signal — and as a consequence, that of the matriced Luma signal
  4. Increases the effective output green video bit depth
  5. The half-pixel offset between the two separate green sampling lattices — both horizontally and vertically — virtually eliminates the first order sideband spectra associated with the sensor sampling process. This eliminates green aliasing.
  6. Creates an effective FIR* filter within the readout process that aids the optimization of the horizontal MTF and the progressive vertical MTF and associated aliasing."
The summation of the two greens is said to increase the DR from 70dB to 73.5dB in green (more precisely, from 70.5dB to 73.5dB), or to 72dB (12 stops) in luma.
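These numbers can be checked with back-of-the-envelope arithmetic: summing two uncorrelated green channels doubles the saturation level (+6dB) while the noise only grows by sqrt(2) (+3dB), for a net +3dB of dynamic range. A minimal sketch, where the 10,000e- full well is an assumed, illustrative figure backed out of the 70.5dB number:

```python
import math

def db(ratio):
    """Convert a linear amplitude ratio to decibels (20*log10)."""
    return 20 * math.log10(ratio)

# Assumed, illustrative single-channel numbers:
full_well = 10000.0                          # e-, hypothetical saturation level
read_noise = full_well / 10 ** (70.5 / 20)   # e-, noise implied by 70.5dB DR

# Summing the two green channels:
sum_full_well = 2 * full_well                # saturation doubles (+6dB)
sum_noise = math.sqrt(2) * read_noise        # uncorrelated noise adds in quadrature (+3dB)

dr_single = db(full_well / read_noise)       # 70.5dB
dr_summed = db(sum_full_well / sum_noise)    # 73.5dB: a net +3dB gain
print(f"{dr_single:.1f} dB -> {dr_summed:.1f} dB")
```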

Although the camera runs at 24p, the frame readout time is 1/60s to reduce rolling shutter effects:


In 60i mode each half-frame is read in 1/120s, the same per-row speed as in 24p mode.
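The per-row claim is easy to verify: a 1080-row frame swept in 1/60s and a 540-row field swept in 1/120s have identical row times, and hence identical rolling-shutter skew. A quick illustrative check:

```python
rows_per_frame = 1080   # progressive frame rows (SMPTE 274M)
rows_per_field = 540    # each interlaced field carries half the rows

row_time_24p = (1 / 60) / rows_per_frame    # full frame swept in 1/60s
row_time_60i = (1 / 120) / rows_per_field   # each field swept in 1/120s

print(f"24p row time: {row_time_24p * 1e6:.3f} us")
print(f"60i row time: {row_time_60i * 1e6:.3f} us")
# Identical row times mean identical rolling-shutter skew in both modes:
assert abs(row_time_24p - row_time_60i) < 1e-18
```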

The low read noise is achieved by limiting the readout amplifier bandwidth, as shown below:


Another Canon whitepaper, "RGB Resolution Considerations in a New CMOS Sensor for Cine Motion Imaging", shows the advantages of the proposed green processing for resolution extension:

Showing the two separate green 1920 (H) x 1080 (V) photosite lattices
and the horizontal and vertical timing offsets between each of the
two “diagonal” pixels that are summed during the readout process

The resulting horizontal and vertical MTFs of the whole system are improved:


The summary says: "A new CMOS image sensor has been described. It represents a definitive decision by Canon to enter the global field of digital cinematic motion imaging. It is anticipated that there will be many progressive advances in the years ahead. Accordingly, a priority was assigned to taking a first step into this important field of imaging by placing an initial focus on originating a very high quality RGB video component set specifically intended for high-performance High definition video production."

Another whitepaper, titled "Sensitometric Characteristics of EOS C300 Digital Cine Camera", mainly focuses on the system processing of the video signal and introduces the "Canon-Log" response.

Samsung IEDM 2011 Paper

Eric Fossum put his Samsung IEDM 2011 paper on-line:

"A 192×108 pixel ToF-3D image sensor with single-tap concentric-gate demodulation pixels in 0.13 μm technology"
T.Y. Lee, Y.J. Lee, D.K. Min, S.H. Lee, W.H. Kim, S.H. Kim, J.K. Jung, I. Ovsiannikov,
Y.G. Jin, Y.D. Park, E.R. Fossum, and C.H. Chung

"A 3D-ToF FSI image sensor using novel concentric photogate [CG] pixels with single-tap operation is described. Through the use of CG structure, we are able to achieve high DC at larger pixel pitches. The new CG pixel structure substantially improves DC [demodulation contrast] to 53% at 20MHz at 28 μm pixel pitch. Recent initial results from a backside-illuminated (BSI) implementation of the same sensor show further improved performance and will be reported elsewhere."
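For context, a generic continuous-wave ToF pixel recovers depth from the phase of the demodulated return; a single-tap pixel acquires the phase samples one at a time rather than in parallel. Below is a textbook-style sketch of the phase-to-depth step (not Samsung's actual pipeline):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a1, a2, a3, f_mod):
    """Depth from four correlation samples taken at 0/90/180/270 degree
    demodulation phases of a continuous-wave ToF pixel. A single-tap
    pixel collects these samples sequentially instead of in parallel."""
    phase = math.atan2(a3 - a1, a0 - a2)   # phase shift of the returning light
    if phase < 0:
        phase += 2 * math.pi
    return C * phase / (4 * math.pi * f_mod)

# At 20MHz modulation the unambiguous range is c/(2*f), about 7.5m.
print(tof_depth(1.0, 0.5, 0.0, 0.5, 20e6))   # target at zero phase: 0m
print(tof_depth(0.5, 0.0, 0.5, 1.0, 20e6))   # 90 degrees of phase: ~1.87m
```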

Friday, December 30, 2011

Truesense Imaging Inc. and Digital Optics Corp.

As written in comments, the recently acquired Kodak Image Sensor Solutions has been quietly renamed Truesense Imaging, Inc. Kodak first used the Truesense name for its W-RGB color filter products almost 3 years ago. I wonder if the new company name is meant to emphasize the importance of the W-RGB products.

Meanwhile, Tessera renamed its imaging and optics division to Digital Optics Corporation. The new entity is responsible for wafer-scale optics (former Shellcase), EDoF (former Eyesquad and Dblur), MEMS AF motors (former Siimpel), micro-optics (the original bearer of the Digital Optics Corporation name, acquired by Tessera in 2006), and image enhancement software (former Fotonation). It appears that the division was renamed and spun off into a wholly owned subsidiary in June 2011.

Another part of Tessera, dealing with chip packaging, has been separated and renamed too. Its new name is Invensas. In Nov. 2011 Invensas acquired the patent assets of California-based TSV foundry ALLVIA. It does not seem to target image sensor applications, though.

1.8 Gigapixel Camera Deployed on Helicopter Drones

BBC, US Army: The A160 Hummingbird helicopter-style drones with 1.8 Gigapixel color cameras are being developed by the US Army promising "an unprecedented capability to track and monitor activity on the ground".

A statement added that three of the sensor-equipped drones were due to go into 1-year trial service in Afghanistan in either May or June 2012 as a part of a Quick Reaction Capability, an acquisition approach aimed at delivering cutting-edge and emerging technologies to theater. The army developers and engineers are now finishing up some wiring work on the A160 aircraft and performing ground tests with the ARGUS sensor suite.

Boeing built the first drones, but other firms can bid to manufacture others. The 1.8 Gigapixel ARGUS-IS camera is developed and manufactured by BAE Systems.


The army said that was enough to track people and vehicles from altitudes above 20,000 feet (6.1km) across almost 65 square miles (168 sq km). In addition, operators on the ground can select up to 65 steerable "windows" following separate targets to be "stared at".

DARPA is also working with the UK-based division of BAE Systems to develop a more advanced version of the Argus-IS sensor that will offer night vision. It said the infrared imaging sensors would be sensitive enough to follow "dismounted personnel at night". In addition, the upgrade promises to be able to follow up to 130 "windows" at the same time. The system's first test flight has been scheduled to take place by June 2012.

Thanks to CDM for the link!

Thursday, December 29, 2011

Digitimes: Samsung and Sony to Supply Sensors for Next Generation iPads

Digitimes quotes its sources saying that the next-generation iPad 3 will be released in two versions. The high-end version will feature an 8MP camera with a Sony sensor. As for the mid-range model, Samsung is said to be among the suppliers of its 5MP sensor.

The new iPad 3 tablets are to be announced at iWorld on Jan. 26, 2012, according to the newspaper. The original version of iPad was announced on Jan. 27, 2010, while the iPad 2 was first shown on March 2, 2011.

Microsoft Proposes Double Helix PSF for Depth Sensing

Microsoft patent application US20110310226 "Use of wavefront coding to create a depth image" by Scott McEldowney proposes a fresh idea to acquire image depth information.

Here is the original description:

"[A] 3-D depth camera system includes an illuminator and an imaging sensor. The illuminator creates at least one collimated light beam, and a diffractive optical element receives the light beam, and creates diffracted light beams which illuminate a field of view including a human target. The image sensor provides a detected image of the human target using light from the field of view but also includes a phase element which adjusts the image so that the point spread function of each diffractive beam which illuminated the target will be imaged as a double helix. [A] ...processor ...determines depth information of the human target based on the rotation of the double helix of each diffractive order of the detected image, and in response to the depth information, distinguishes motion of the human target in the field of view."

Actually, it's much easier to understand this idea in pictures. Below is the illuminator with a diffractive mask 908:


There is another mask 1002 on the sensor side:


Below is the proposed double-helix PSF as a function of distance. One can see that the two points line angle changes as a function of depth:


The orientation angle of the PSF points depends on wavelength (not shown here, see in the application) and the distance (shown below):


From this angle the object distance can be calculated - this is the idea. Microsoft gives an example image and shows how it changes with distance in what looks like a Wide-VGA sensor plane:
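A toy sketch of the depth recovery step: locate the two PSF lobes, measure the orientation of the line joining them, and invert an angle-vs-depth calibration. The linear calibration and its coefficients below are hypothetical, for illustration only:

```python
import math

def helix_angle(p1, p2):
    """Orientation (radians) of the line through the two detected PSF lobes."""
    (x1, y1), (x2, y2) = p1, p2
    return math.atan2(y2 - y1, x2 - x1)

def angle_to_depth(angle, focus_m=2.0, rad_per_meter=0.5):
    """Invert an assumed linear angle-vs-depth calibration. Both coefficients
    are hypothetical; a real system would calibrate the rotation-vs-distance
    curve (and its wavelength dependence)."""
    return focus_m + angle / rad_per_meter

lobes = ((10.0, 12.0), (14.0, 16.0))    # lobe centroids in pixels, illustrative
theta = helix_angle(*lobes)             # 45 degrees of helix rotation
print(f"{math.degrees(theta):.0f} deg -> {angle_to_depth(theta):.2f} m")
```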





Update: As written in comments, the University of Colorado, Denver has been granted patent US7705970 on a very similar idea. A figure in the patent looks very similar:

Tuesday, December 27, 2011

1/f and RTS Noise Reduction

As mentioned in the Theses post, Oregon State University published Drake A. Miller's PhD Thesis "Random Dopants and Low-Frequency Noise Reduction in Deep-Submicron MOSFET Technology". The thesis is quite rich in experimental data on pixel source follower noise. The figure below shows more than an order of magnitude variation in 1/f noise across the wafer:

Noise spectral power plots of 10 devices taken from
10 different locations across the wafer (see inset).
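A useful property for interpreting these plots: the integrated power of a 1/f spectrum S(f) = K/f over a band [f1, f2] is K*ln(f2/f1), so a 10x spread in the spectral coefficient K maps directly to a 10x spread in total noise power, or roughly 3.2x in rms. Illustrative numbers only:

```python
import math

def one_over_f_power(K, f_lo, f_hi):
    """Integrated power of a 1/f spectrum S(f) = K/f over [f_lo, f_hi]:
    the integral evaluates to K*ln(f_hi/f_lo)."""
    return K * math.log(f_hi / f_lo)

# Hypothetical spectral coefficients differing 10x, as between the best and
# worst devices on the wafer (units arbitrary):
p_low = one_over_f_power(1e-10, 1.0, 1e6)
p_high = one_over_f_power(1e-9, 1.0, 1e6)
print(p_high / p_low)                 # 10x in integrated power...
print(math.sqrt(p_high / p_low))      # ...i.e. ~3.16x in rms noise voltage
```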

Any channel doping, such as a Vth adjust implant, significantly increases 1/f and RTS noise:

Box plots of source follower noise power spectrum plots.
Red (Dark) boxes are doped devices.
Green (Light) boxes are undoped “native” transistors.

A few Vth adjust splits were measured:


It's not clear why S4 and S7 are not shown, but S1-S3 clearly show noise improvement:


The total read noise histogram clearly demonstrates the advantage of the lightly doped source follower:


The RTS statistics show the same:

Photons to Bits and Beyond Presentation On-Line

Eric Fossum published the pdf notes of his lecture "Photons to Bits and Beyond. The Science and Technology of Digital Imaging".

Monday, December 26, 2011

Recent Image Sensor Theses

There are a few recently published image sensor theses:

"Pixel and Readout Circuit of a Wide Dynamic Range Linear-Logarithmic Current-Mode Image Sensor"
MS Thesis by Elham Khamsehashari, Aug. 2011
ÉCOLE POLYTECHNIQUE DE MONTRÉAL

"This thesis presents a current-mode CMOS image sensor operating in linear-logarithmic response. The objective of this design is to improve the dynamic range of the image sensor, and to provide a method for mode detection of the image sensor response. One of the motivations of using current-mode has been the shrinking feature size of CMOS devices. This leads to the reduction of supply voltage which causes the degradation of circuit performance in term of dynamic range. Such problem can be alleviated by operating in current-mode. The column readout circuits are designed in current-mode in order to be compatible with the image sensor. The readout circuit is composed of a first-generation current conveyor, an improved current memory is employed as a delta reset sampling unit, a differential amplifier as an integrator and a dynamic comparator."

"Single Shot High Dynamic Range and Multispectral Imaging Based on Properties of Color Filter Arrays"
MS Thesis by Paul M. Simon
UNIVERSITY OF DAYTON, May 2011

"This paper addresses the difficulty of generating High Dynamic Range (HDR) images using current Low Dynamic Range (LDR) camera technology. Typically, several LDR images must be acquired using various camera f-stops and then the images must be blended using one of several exposure bracketing techniques to generate HDR images. Based on Fourier analysis of typical Color Filter Array (CFA) sampled images, we demonstrate that the existing CFA sampled images provide information that is currently underutilized. This thesis presents an approach to generating HDR images that uses only one input image while exploiting that underutilized CFA data. We propose that information stored in unsaturated color channels is used to enhance or estimate details lost in saturated regions."

One must note that the DR extension is not that big and is based on the assumption that not all colors saturate simultaneously.
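A toy sketch of the underlying idea: where one color channel clips, re-estimate it from an unsaturated channel scaled by the red/green ratio measured at unclipped pixels. This is not the thesis's actual algorithm; the global ratio and the 2x2 patch below are hypothetical simplifications for illustration:

```python
import numpy as np

def recover_saturated_red(r, g, sat=255):
    """Toy single-image recovery: where the red channel clips, re-estimate
    it from the unsaturated green channel scaled by the red/green ratio
    measured over pixels where both channels are valid. (A global ratio is
    used here for brevity; a real method would work on local neighborhoods.)"""
    r = r.astype(float)
    g = g.astype(float)
    ok = (r < sat) & (g < sat)                       # both channels unclipped
    ratio = np.median(r[ok] / np.maximum(g[ok], 1e-6))
    return np.where(r >= sat, g * ratio, r)          # extrapolate clipped red

# Hypothetical 2x2 patch: red clips at three bright pixels, green does not.
g = np.array([[100.0, 200.0], [150.0, 240.0]])
r = np.array([[200.0, 255.0], [255.0, 255.0]])
print(recover_saturated_red(r, g))   # clipped reds re-estimated as 2x green
```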

"Analysis, Modeling and Dynamic Optimization of 3D Time-of-Flight Imaging Systems"
PhD Thesis by Mirko Schmidt
Ruperto-Carola University of Heidelberg, Germany, July 2011

"This thesis covers four main contributions: A physical sensor model is presented which enables the analysis and optimization of the process of raw image acquisition. This model supports the proposal of a new ToF sensor design which employs a logarithmic photo response.
Due to asymmetries of the two read-out paths current systems need to acquire the raw images in multiple instances. This allows the correction of systematic errors. The present thesis proposes a method for dynamic calibration and compensation of these asymmetries. It facilitates the computation of two depth maps from a single set of raw images and thus increases the frame rate by a factor of two.
Since not all required raw images are captured simultaneously motion artifacts can occur. The present thesis proposes a robust method for detection and correction of such artifacts.
All proposed algorithms have a computational complexity which allows real-time execution even on systems with limited resources (e.g. embedded systems). The algorithms are demonstrated by use of a commercial ToF camera.
"

"Random Dopants and Low-Frequency Noise Reduction in Deep-Submicron MOSFET Technology"
PhD Thesis by Drake A. Miller
Oregon State University, March 2011

Quite significant RTS and 1/f noise reduction in image sensors has been reported:
"In the case of this research it was shown that once the noise source and mechanism was understood necessary steps could be taken to reduce the source of the noise. Two examples shown here are the impact of substrate bias and modification of the doping levels. Substrate biasing is a relatively straightforward approach to reducing the noise and has been shown here to have this repeatable effect. With additional understanding of the percolation currents modification of the channel dopant profile can serve as an additional means for device noise improvement. Once understood, these relatively easy steps, as in the case of reducing the implant dose in the channel, verified the theory and model developed during this research and resulted in a superior performing CMOS image sensor product."

Thursday, December 22, 2011

e2v Applies for Electron Multiplying CMOS Sensor

e2v applies for a patent extending its EMCCD technology to the realm of CMOS sensors: "Electron multiplication image sensor and corresponding method" by Frédéric Mayer (France). Fig. 1 of the US20110303822 application shows a prior art 4T pixel having a pinned photodiode PHD:


e2v proposes to split the PHD into two with the "accelerating gate" GA in between, as on Fig. 2. By applying multiple voltage pulses on GA the electrons can be moved in and out of it, as shown on Fig. 3.

"The electron multiplication takes place during the charge integration and in the photodiode itself in the sense that the electrons (photogenerated or resulting already from the impacts of carriers with atoms) are accelerated in turn from the photodiode towards the accelerating gate and from the accelerating gate towards the photodiode. During these movements, impacts with atoms of the semiconductor layer of the photodiode region or of the region located beneath the accelerating gate make other electrons in the valence band pass into the conduction band. These electrons lose energy during these impacts but they are again accelerated by the electric field that is present.

The number of alternations in potential applied to the accelerating gate defines the overall multiplication coefficient obtained at the end of an integration period T, i.e. between two successive pulses for transferring charge from the photodiode to the charge storage region.
"

Fig. 4 shows one of the possible pixel layouts with GA located in the middle of PHD.
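The gain mechanism compounds the same way as an EMCCD multiplication register: if each accelerating-gate cycle gives every electron a small probability p of an impact-ionization event, the mean gain after N cycles is (1+p)^N. A sketch with assumed, illustrative values of p:

```python
def em_gain(p, n_cycles):
    """Mean multiplication gain after n accelerating-gate cycles, assuming
    each cycle gives every electron an independent probability p of one
    impact-ionization event (the same compounding as an EMCCD register)."""
    return (1 + p) ** n_cycles

# Illustrative, assumed values of p; even a 1% per-cycle gain compounds fast:
print(f"{em_gain(0.01, 100):.2f}x after 100 cycles")   # ~2.70x
print(f"{em_gain(0.01, 500):.0f}x after 500 cycles")   # ~145x
```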

Update: As said in comments, in 2009 Sanyo published a different idea of electron multiplying CMOS pixel. The idea is shown on the figure below:


Update #2: As EF said in comments, Sanyo presented its electron multiplying sensor at ISSCC 2009 (paper, presentation). The pixel structure and the gain non-uniformity are taken from the presentation slides:


Wednesday, December 21, 2011

Fujifilm Organic Sensor Article

DPReview published an article on Fujifilm's organic image sensor patents. The article also quotes Eric Fossum on Fujifilm pixel and other new thin film pixel approaches.

Blocked Holes Can Enhance Light, Rather than Stop it

Physorg.com, Optics InfoBase: "Many optical systems today, such as those in sensing, nanolithography, and many others, are built on a general belief: An optically opaque metal film would block light transmission even if the film has small holes, as long as the holes are covered with opaque metals which geometrically block the light path through the holes. For example, light transmission from one side of a glass to the other side is assumed to be blocked, when an opaque metal film is coated on one surface of the glass, even if the surface unavoidably has tiny dusts. This is because the coated metal covers the dust completely, hence blocking the light geometric path through the dust. Here, we report our experimental and theoretical study that demonstrates otherwise: Not only the light can transmit, but also the transmission is greatly enhanced, which is much better than an open hole. Furthermore, we found the transmission can be tuned by the metal blocker’s geometry and by the gap between the blockers and the metal film."


These electron microscope images show an experiment in which Princeton Professor of Engineering Stephen Chou showed that blocking a hole in a thin metal film could cause more light to pass through the hole than leaving the hole unblocked. The top image shows an array of 60nm holes spaced 200nm apart with gold caps, each of which is 40 percent bigger than the hole on which it sits. The bottom image shows a cross-section view of one hole with the cap sitting on top of an SiO2 pillar. The gold film in the experiment was 40nm thick. The hole covered with the cap surprisingly allows 70% more light to be transmitted through the film than a hole without the cap, Chou's research team found.

"We did not expect more light to get through," Chou said. "We expected the metal to block the light completely."

Chou said the metal disk acts as a sort of "antenna" that picks up and radiates electromagnetic waves. In this case, the metal disks pick up light from one side of the hole and radiate it to the opposite side. The waves travel along the surface of the metal and leap from the hole to the cap, or vice versa depending on which way the light is traveling. Chou's research group is continuing to investigate the effect and how it could be applied to enhance the performance of ultrasensitive detectors.

Comparison of transmittance measurements showing 70% transmission enhancement of the blocked hole array over the open hole array.
(a) Experimental transmittance spectra measured on a periodic gold hole array blocked by Au nanodisks and the same gold hole array after removal of the nanodisks. The hole array has a hole diameter of 70 nm and a gold thickness of 40 nm, the gold nanodisks have a diameter of 85 nm, and the SiO2 pillar height is 52 nm.
(b) Plot of transmission enhancement ratio calculated by dividing the optical transmission of blocked and open gold hole arrays. A maximum enhancement of 1.7x is observed at 680 nm.

Thanks to JM for sending me the link!

Friday, December 16, 2011

FPN Measurements Wrap Up

Albert Theuwissen wraps up FPN measurement series of posts. The concluding post adds all the previously discussed numbers and graphs together.

Eedoo Game Console Postponed Again

Penn Olson reports that the release of China's Eedoo iSec console has been pushed back again. Eedoo's CEO tells Sina Tech there's no scheduled release date now. The console features a Softkinetic ToF 3D sensor and is supposed to compete with Xbox Kinect.

Eedoo is now negotiating its 3rd round of financing. The company's CEO said that next year the company's total investment in the iSec project will reach 100M yuan ($15M).

Update: PC World: Eedoo has pushed back its launch date again to some time later in 2012, said Eedoo spokesman Victor Wang on Monday. A source close to the situation however said on condition of anonymity that the launch of the product may be delayed further as the product was not found to be robust enough.

Thursday, December 15, 2011

ST Pixel Simulation Paper

ST presented "3D TCAD Simulation of Advanced CMOS Image Sensors" paper at 2011 International Conference on Simulation of Semiconductor Processes and Devices (SISPAD 2011) held in Osaka, Japan, in Sept. 2011. The paper is written by Z. Essa, P. Boulenc, C. Tavernier, F. Hirigoyen, A. Crocherie, J. Michelot, D. Rideau.

The 2.5D and 3D process and device simulations are compared, based on the Synopsys Sentaurus simulator; the Lumerical FDTD simulator was used for the optical part. The 1.4um FSI pixel simulations show a 3D Qsat of 4200e-, while it is 5800e- in both measurements and 2.5D simulations.

The discrepancy exists also between the simulated and measured QE:


The paper conclusion is that "further simulations calibration adjustments are required to match experimental Qsat and QE".

Another interesting paper is "Modeling Statistical Distribution of Random Telegraph Noise Magnitude" by Ken'ichiro Sonoda, Motoaki Tanizawa, Kiyoshi Ishikawa, and Yasuo Inoue from Renesas.

Wednesday, December 14, 2011

MIT Camera Capable of 1.71ps Exposure

The device has been developed by the MIT Media Lab's Camera Culture group in collaboration with the Bawendi Lab in the Department of Chemistry at MIT. A laser pulse that lasts less than one trillionth of a second is used as a flash, and the light returning from the scene is collected by a camera at a rate equivalent to roughly half a trillion frames per second. However, due to the very short exposure time (roughly two trillionths of a second) and the narrow field of view of the camera, the video is captured over several minutes by repeated and periodic sampling. The new technique is able to compose a single 2D movie of roughly 480 frames, each with an effective exposure time of 1.71 picoseconds.
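Some illustrative arithmetic on these figures (the half-trillion fps rate and frame count are from the text; the rest is derived): 480 frames at that rate span under a nanosecond, during which light travels only a fraction of a millimeter per frame, which is why the pulse appears frozen mid-flight:

```python
C = 299_792_458.0        # speed of light, m/s

frames = 480             # frames in the composed movie
exposure = 1.71e-12      # effective exposure per frame, s
rate = 0.5e12            # ~half a trillion frames per second

window = frames / rate   # total time span the movie covers
print(f"movie spans {window * 1e9:.2f} ns")                   # ~0.96 ns
print(f"light travels {C / rate * 1e3:.2f} mm per frame")     # ~0.60 mm
```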

MIT's Youtube video shows the camera at work:



Thanks to RC and CDM for sending me the link!

ISAE Publications On-Line

Institut Supérieur de l’aéronautique et de l’espace (ISAE) Image Sensor Research Team (CIMI) kindly made its publications available on-line.

Thanks to VG for sending me the link!

ISORG Demos Magic Pad

Printed organic sensor company ISORG demos its first product in these Youtube videos:

This bar graph applet illustrates what the ISORG organic sensor "sees" on top of the tablet: it estimates the relative distance to the object above the pad by the amount of light each pixel receives:



Virtual disc with a touch-less interface:



Thanks to SG for the link!

Update: Here is Photonics 21, Sept. 2011 presentation by ISORG.

Tuesday, December 13, 2011

Caeleste Presents its 0.5e- Noise Pixel and More

As BD pointed out in comments, the Caeleste publications page has been updated to include the latest CNES Workshop 2011 presentations. The most interesting one is "A 0.5 noise electrons CMOS pixel" by Bart Dierickx, Nayera Ahmed, and Benoit Dupont. The presentation explains the 1/f and RTS noise reduction principle, which cycles the pMOSFET between accumulation and inversion:


150 inversion-accumulation cycles are averaged to reduce the pixel noise down to the 0.5e- level:


The result was measured on a technology demonstrator based on a 100um standalone test structure with ~7μm MOSFET area; the pixel is used in CTIA mode with >1000μV/e- conversion gain:
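The 0.5e- figure is consistent with simple averaging statistics: uncorrelated samples average down as 1/sqrt(N), so 150 cycles buy roughly a 12x reduction. The ~6e- single-cycle noise below is a hypothetical back-calculation, not a figure from the presentation:

```python
import math

def averaged_noise(single_cycle_noise_e, n_cycles):
    """RMS noise after averaging n uncorrelated samples: scales as 1/sqrt(n)."""
    return single_cycle_noise_e / math.sqrt(n_cycles)

# Hypothetical back-calculation: ~6.1e- per cycle averages to ~0.5e- over 150.
print(f"{averaged_noise(6.1, 150):.2f} e-")
```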


Another new CNES Caeleste presentation is "High QE, Thinned Backside-Illuminated, 3e- RoN, Fast 700fps, 1760×1760 Pixels Wave-Front Sensor Imager with Highly Parallel Readout."

Sensors and Space

Teledyne DALSA announced that NASA-designed, DALSA-manufactured CCDs are embedded in the Engineering Cameras of the Mars Curiosity Rover, launched on Saturday, November 26, 2011. The Engineering Cameras, known as the Navcam and Hazcam cameras, are located on the Mars Science Laboratory (MSL) Rover and are used for navigation on the surface of Mars. The Rover carries 4 Navcam cameras and 8 Hazcam cameras.

Navcams (Navigation Cameras) are B&W stereo cameras using visible light to gather panoramic 3D imagery for ground navigation planning by scientists and engineers. Hazcams (Hazard Avoidance Cameras) are B&W cameras using visible light to capture 3D imagery that safeguards against the rover getting lost or inadvertently crashing into unexpected obstacles; they work in tandem with software that allows the rover to make its own safety choices and to "think on its own."

Teledyne DALSA also announced it will partner with Surrey Satellite Technology Limited (SSTL) to develop a new multispectral sensor for an advanced earth observation application. The multimillion-dollar development project is expected to begin delivering high resolution images during 2014 for applications such as urban planning and environment and disaster monitoring. The custom multispectral sensors are to be designed and manufactured by 2013:

Monolithic multispectral imagers--3, 4, 5 or more different imaging areas on one chip

e2v has signed a multi-million dollar contract for a 2 year program to supply the complete 1.2 Giga-pixel camera system for the Javalambre Physics-of-the-Accelerating-Universe Astrophysical Survey (J-PAS) project funded by a consortium of Spanish and Brazilian astronomy institutes. J-PAS will be dedicated to creating a map of the observable Universe in 56 continuous wavebands from 350nm to 1000nm. The e2v cryogenic camera system has a 1.2 gigapixel mosaic array capable of being read out in 10 seconds.

The camera will be designed and built by e2v, will use 14 newly developed CCD290-99 sensors, and includes a guarantee of the camera's performance levels and a commercial warranty. The 85MP CCDs will be back-thinned and given a multi-layer anti-reflection coating. They are in a 9k x 9k pixel format, with multiple outputs for rapid readout, and are mounted in a precision package so that they can be assembled into a mosaic, providing an image area nearly 0.5m in diameter. The focal plane assembly will also include the telescope guide and wavefront sensors. The whole focal plane will then be contained in a custom cryogenic camera, with vacuum and cooling components and integrated electronics providing state-of-the-art low noise for maximum sensitivity.
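Some illustrative arithmetic on the mosaic: 14 CCDs of ~85MP each give ~1.19 Gpixel, and a 10-second readout implies a modest per-output pixel rate once split across multiple outputs. The 16-outputs-per-CCD figure below is a hypothetical assumption, not from the announcement:

```python
ccds = 14                # CCD290-99 sensors in the mosaic
pixels_per_ccd = 85e6    # ~9k x 9k each
readout_time = 10.0      # seconds for the full focal plane

total = ccds * pixels_per_ccd
print(f"mosaic: {total / 1e9:.2f} Gpixel")                          # ~1.19
print(f"aggregate rate: {total / readout_time / 1e6:.0f} Mpixel/s")

# With a hypothetical 16 outputs per CCD, each output runs at only:
outputs_per_ccd = 16
print(f"{pixels_per_ccd / readout_time / outputs_per_ccd / 1e6:.2f} Mpixel/s per output")
```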

e2v has also signed a multi-million Euro contract with Thales Alenia Space for the design, development and manufacture of a space qualified CMOS imaging sensor for use in the Flexible Combined Imager (FCI) instrument of the Meteosat Third Generation (MTG), an ESA and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) program. The first MTG-I satellite is expected to be launched in 2017, with the first MTG-S following in early 2019.

Monday, December 12, 2011

Image Sensor 2012 Program

Formerly known as Image Sensors Europe, the Image Sensors 2012 Conference will take place in London, UK, on March 20-22, 2012. Below is a preliminary program:

KEYNOTE
Quanta Image Sensor (QIS) – a possible paradigm shift for the future
Dr Eric R. Fossum, Solid-State Image Sensor Device Physicist and Engineer, and Professor of Engineering, THAYER SCHOOL OF ENGINEERING AT DARTMOUTH, US
  • Review of 1st and 2nd generation solid-state image sensors
  • Issues with physics limitations – shot noise, diffraction limit, electronics noise
  • Issues with future sensor implementation – pixel size, pixel count, read noise, full well, QE, color separation
  • The Quanta Image Sensor concept and technical challenges
  • New paradigm for image formation from QIS

KEYNOTE
Dark Current and White Blemish
Dr Nobukazu Teranishi, General Manager Image Sensor Technology, Image Sensor Business Unit, Semiconductor Company, PANASONIC CORPORATION, Japan
  • Why are they always important?
  • Stresses
  • Getterings
  • Effect and limitation of the pinned photodiode
  • Recent approaches

KEYNOTE
High performance imaging detectors for space astrophysics
Dr Samuel Harvey Moseley, Senior Astrophysicist, Laboratory for Observational Cosmology, NASA/GODDARD SPACE FLIGHT CENTER, US
  • Science goals of space astrophysics
  • Performance requirements for detectors in space applications
  • Unique problems for operating detectors in the space environment
  • Calibration requirements for detectors in space astrophysics
  • Detection systems required by future astrophysics missions

Next generation camera phones - trends and technologies
Mats Wernersson, Master Engineer - Camera Research, SONY ERICSSON, Denmark
  • Trends in the usage of cameras in mobile phones
  • Consumer needs and technology deliverables
  • What direction would we like to see taking shape?
  • Disruptive technologies and innovations which may alter future sensor usage
  • Challenges for the image sensor industry

Overview of Pixel Development for Mobile, Automotive, Industrial, and high-end DSC Applications
Dr Gennadiy Agranov, VP Imaging Technology, APTINA IMAGING, US
  • Small pixel development including latest results on BSI pixel development
  • Global shutter pixel development
  • Large pixel development for automotive, industrial, and high DSC markets
  • Pixel development for 3D and depth sensing

The impact of LED lighting on colour rendition in TV and other cameras
Richard Salmon, Lead Research Engineer, BBC RESEARCH & DEVELOPMENT, UK
  • How the camera sensor is different from the human eye
  • Why new LED lighting is so different from traditional studio lighting
  • Why the Colour Rendering Index, hitherto used to assess the quality of lighting, is unsuitable for quantifying lighting to be used with cameras
  • Recommendations for a TV Lighting Consistency Index
  • What factors camera sensor manufacturers might need to consider

Development of 33Mpixel 120fps CMOS image sensor for full-spec Super Hi-vision
Dr Hiroshi Shimamoto, Senior Research Engineer, NHK (JAPAN BROADCASTING CORPORATION) Science & Technology Research Laboratories, Japan
  • World’s first image sensor that covers spec requirements for “full-spec” Super Hi-vision
  • High output data rate as more than 51Gbps
  • Low power consumption as less than 2.5W
  • Newly developed on-chip ADC technology
  • Experimental image acquisition system for this sensor is under development

Trends in gesture recognition technology
Daniel Van Nieuwenhove, CTO, SOFTKINETIC, Belgium
  • Recent developments in gesture recognition technology: a review of progress to date
  • Trends in 3D time of flight cameras and gesture recognition middleware
  • A review of current and potential future applications
  • Challenges and opportunities

To Be Announced
Dr Eiichi Funatsu, Senior Manager - Sensor Division, Semiconductor Business Group, SONY CORPORATION, Japan

Challenges and opportunities in medical endoscopic imaging
Dr Koichi Mizobuchi, Deputy General Manager, Imaging Technology Development, OLYMPUS MEDICAL SYSTEMS CORPORATION, Japan
  • History of medical endoscopy
  • Application of medical endoscopy
  • Optical light filtering technologies
  • Capsule endoscope
  • Requirements for future image sensors

Defence / military sector & future challenges for image sensors
Oscar d’Almeida, Director, Detectors Technology, R&T/Optronics and Defence Division, SAFRAN SAGEM, France
  • Main objectives for military capabilities
  • Military/defence equipment needs and requirements
  • State of the art of imaging sensors
  • Main challenges
  • Equipment: Integration and trade-off

A plenoptic image sensor for monocular 3D cameras
Dr Christian Perwass, Managing Director, RAYTRIX GmbH, Germany
  • Introduction to the image generation concept behind plenoptic cameras
  • Image generation and depth calculation from plenoptic images
  • Microlens array design for high effective spatial and depth resolution
  • Measurements and theoretical capabilities of Raytrix plenoptic cameras
  • Application examples

CMOS image sensors for non-visible applications: seeing the invisible
Dr Renato Turchetta, CMOS Sensor Design Group Leader, RUTHERFORD APPLETON LABORATORY – STFC, UK
  • Review of applications in IR, UV, X-ray
  • What is different from visible light applications:
    Sensing
    Processing
  • Enhancing the sensitivity of silicon by coupling it to scintillators
  • A 16Mpixel sensor for high-resolution X-ray imaging
  • A wafer-scale sensor for X-ray medical imaging

Image sensors for photogrammetry and orthophoto production
Udo Tempelmann, Manager System Engineering, LEICA GEOSYSTEMS, Aerial Imaging, Switzerland
  • The challenge of highest resolution and area coverage under natural light conditions
  • How existing image sensors meet the challenge
  • Future technology requirements for aerial image sensors
  • Consumer level photogrammetric products and how they relate to classical photogrammetry
  • Synergy effects of combining with sensors other than the classical silicon photodiode

Near-photon counting CMOS pixel
Bart Dierickx, Founder, CAELESTE, Belgium
  • Photon counting, near photon counting and classic charge integration imaging
  • How low in noise can one and must one go?
  • Caeleste’s concept to cancel 1/f and RTS noise
  • The impact of 1/f noise cancellation on overall read noise
  • Image sensor specifications

Drivers shaping healthcare imaging
Dr Martin Spahn, Senior Manager Imaging Technologies, SIEMENS, Germany
  • X-ray image sensor technology currently being used to meet patients' needs within the sector - case studies
  • Future technology requirements
  • Research and development work which will shape future applications
  • Challenges to consider

High QE, Thinned Backside-Illuminated, 3e- RoN, Fast 700fps, 1760x1760 Pixels Wave-Front Sensor Imager with Highly Parallel Readout
Mark Downing, CCD Specialist - Optical Detector Team, EUROPEAN SOUTHERN OBSERVATORY, Germany
  • The success of the next generation of extremely large ground-based optical telescopes (E-ELT, GMT, and TMT) will depend upon improving the image quality (correcting the distortion caused by atmospheric turbulence) by deploying sophisticated Adaptive Optics (AO) systems
  • One of the critical components of the AO systems for the E-ELT has been identified as the wavefront sensor detector
  • The combination of a large array size (1760x1760 pixels, needed to account for the elongation of laser guide stars (LGS)), the fast frame rate of 700 (up to 1000) frames per second, the required high QE (90%), and low read out noise of 3e- makes the development of such a device extremely challenging
  • A CMOS Imager is under development with a highly parallel read out architecture consisting of over 60,000 on-chip ADCs and 88 parallel high speed LVDS ports to achieve the low read out noise at the high pixel rates of ~3 Gpixel/s (~30 GBit/s). The Imager will be thinned and backside illuminated to reach the 90% QE
  • This talk reports on the development of the full size Imager and results of Technology Demonstrators

Overview of the healthcare sector & its future requirements from image sensors
Dr Dimitra G. Darambara, Team Leader, Multimodality Molecular Imaging, Joint Department of Physics, Radiotherapy and Imaging Division, ROYAL MARSDEN NHS FOUNDATION TRUST & INSTITUTE OF CANCER RESEARCH, UK
  • Drivers shaping the healthcare sector
  • Review of state-of-the-art image sensors currently being used within the sector
  • Challenges, demands and trends for future image sensor technology
  • Quantitative imaging beyond visual interpretation
  • New concepts in medical imaging
  • Innovation vs clinical need: translation from concept to clinical practice

Intelligent video for security and increased operational efficiency
David Dorn, Applied Technologies Manager, PELCO by SCHNEIDER ELECTRIC, US
  • Video technology advances in sensors, analytics, and networks
  • State-of-the-art video security applications and systems
  • Multi-spectral imaging systems
  • Integration of sensor networks for improved analytics accuracy
  • Beyond Security - Operational uses of video to improve enterprise efficiency, reduce energy costs, and improve human safety

Imaging technologies and market trends
Dr Valerie Nguyen, Imaging Business Development Manager, CEA-LETI, France
  • Image sensors today: from image quality to sensing functions
  • Market trends and pixels everywhere
  • Multispectral imaging and key technology enablers
  • Technology perspectives

An image of security – trends and developments in modern security imaging
Ian Crosby MinstP, Head of Product Management, Imaging Competence Centre Eindhoven, BOSCH SECURITY SYSTEMS, Netherlands
  • The security imaging market
    Figures
    Applications
  • The challenges associated with security imaging
  • The early days
  • Modern imaging and intelligence

Hiroaki Fujita Passed Away

Albert Theuwissen posted that Hiroaki Fujita passed away. Hiro worked as a pixel and process engineer for Sony, Kodak, Aptina, and, briefly, Panasonic. He was in the group of Kodak engineers that received the 2009 Walter Kosonocky Award for an RGB-W sensor with a 1.4um pixel and p-type photodiode.

Saturday, December 10, 2011

Intel to Support Gesture Recognition in its Chipsets

Reuters: Total Immersion, a pioneer in the emerging field of augmented reality (AR), is working with Intel to bring AR features, like gesture recognition, into Intel's chipsets, Total Immersion's marketing chief Antoine Brachet said.

"What we are doing together with Intel is working on their chipset ... so inside the chipset you can have some AR features, like gesture recognition that can be transferred from software to hardware," Brachet said.

Intel Capital invested $5.5M in Total Immersion in March 2011. Founded in 1999 (1998 according to other sources), Total Immersion has lately attracted the attention of other heavyweights, Google and Qualcomm, according to Reuters.

Friday, December 09, 2011

GSA: Ambarella and Aptina - Most Respected Private Companies

Venture Beat reports from the Global Semiconductor Association's annual awards dinner today in Santa Clara, CA that the Most Respected Private Company award nominees included Ambarella, Aptina, and SandForce. Ambarella won the award.

The GSA awards page explains the award:

"The industry's Most Respected Private Semiconductor Company award is designed to identify the private company garnering the most respect from the industry in terms of its products, vision and future opportunity. GSA's Awards Committee reviews all private semiconductor companies, and the selected nominees and winner are based on the committee's analysis of each company's performance and likelihood of long-term success."

Update: Ambarella: "It is an honor to receive the GSA’s 2011 award for Most Respected Private Semiconductor Company, our fourth GSA award and our second back-to-back award in this category," said Fermi Wang, CEO of Ambarella.

EMCCD Lecture

RIT Center for Detectors published a nice EMCCD lecture by Craig Mackay, Institute of Astronomy, University of Cambridge, UK. The lecture covers main principles and applications of EMCCDs, primarily produced by e2v. Some slides from the lecture:

Introduction to EMCCDs: General Characteristics
  • EMCCDs are standard CCDs plus an electron multiplication stage.
  • EMCCDs may be read out at high pixel rates (up to 30 MHz for E2V EMCCDs, probably up to 60 MHz for TI EMCCDs).
  • The gain mechanism increases the variance in the output signal so that the signal-to-noise ratio goes as √(N/2) rather than √N.
  • Equivalent to halving the detective quantum efficiency.
  • Photon counting can substantially restore this effective loss in quantum efficiency.
  • Clock induced charge (CIC) affects all CCDs, but you will only really notice it with high gain EMCCDs.
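The SNR arithmetic in the bullets above is easy to check numerically. A minimal sketch (assuming an excess noise factor F² = 2 for the multiplication register, and ignoring dark current and CIC):

```python
import math

def snr_ccd(n_e, read_noise=0.0):
    """SNR of a conventional detector: shot noise plus read noise."""
    return n_e / math.sqrt(n_e + read_noise ** 2)

def snr_emccd(n_e, excess_noise_factor_sq=2.0):
    """SNR of an EMCCD in analog mode: high gain makes read noise
    negligible, but multiplication raises the shot-noise variance
    by the excess noise factor F^2 ~ 2."""
    return n_e / math.sqrt(excess_noise_factor_sq * n_e)

n = 100  # photoelectrons
print(snr_ccd(n))      # 10.0  -> sqrt(N)
print(snr_emccd(n))    # ~7.07 -> sqrt(N/2)
print(snr_ccd(n / 2))  # ~7.07 -> same as an ideal detector seeing N/2 photons,
                       #          i.e. the DQE is effectively halved
```

Photon counting (thresholding each pixel at high gain) sidesteps this multiplication noise, which is why the bullet above says it can substantially restore the lost quantum efficiency.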

When Should You Use EMCCDs?
  • Any time that you are really limited by readout noise.
  • This is more likely to be the case when running with high pixel rates. Conventional CCDs give excellent read noise but only at low read-out rates.
  • Recent developments in sCMOS technology are changing this by offering low read-out noise (~1-2 electrons) and 100 Hz frame rates, though best in rolling shutter mode.
  • However, do not forget the equivalent loss in detector quantum efficiency using an EMCCD in analog mode.
  • At the lowest signal levels, photon counting gives close to the theoretical full DQE at high frame rates.

EMCCD Ageing
  • Illuminating the image area to saturation and running the device at a gain of 1000x will cause the device to fail within a few hours.
  • In practice these devices will be used at much lower illumination levels.
  • With reasonable care in system design, many years of operation will be obtained.
  • The ageing effect is seen as an increase in the high-voltage (multiplication) clock level needed to achieve a specific gain.
  • At low gains (gains of a few) no ageing is seen.
  • Increasing the gain from 100x to 1000x roughly doubles the short-term ageing rate with little effect on the long-term ageing.
  • The ageing is principally caused by excessive signal levels in the multiplication register.
  • An increase of 5 V over the life of the device is about the limit before failure occurs.

Thursday, December 08, 2011

Sharp Slim 12MP Module Demo

Wired claims that the recently announced 5.47mm-slim Sharp camera module equipped with a 12.1MP BSI sensor is already on the market, embedded inside the Sharp AQUOS SH-01D smartphone. Wired shows a nice Youtube demo of OIS in this module:

2nd Day of CMOS Image Sensors for High Performance Applications Workshop

Albert Theuwissen published a report on the 2nd day of the CMOS Detector Workshop in Toulouse, France. The report was written by Mukul Sarkar and covers the Caeleste 0.5e- noise sensor, among other papers:

CMOS image sensor pixel with 0.5 noise electrons RMS – CAELESTE
  • The RTS noise and 1/f noise are reduced by cycling the MOSFET between inversion and accumulation to produce uncorrelated noise which, when sampled, becomes “white”.
  • A CTIA configuration was used and a very high conversion gain of nearly 1000uV/e- was reported.
  • Without the MOSFET cycling, a readout noise of 2e- was obtained, while with the cycling a readout noise of 0.5e- in the dark was measured. However, it was mentioned that the measurements showed variance, possibly due to the CVF. Research institutes and PhD students were invited to do an independent confirmation!
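For context on why such a high conversion gain matters, here is a minimal sketch of input-referring an output noise voltage; every number except the reported ~1000uV/e- figure is hypothetical:

```python
def read_noise_electrons(noise_uv_rms, conversion_gain_uv_per_e):
    """Input-refer an output noise voltage: electrons = volts / (volts per electron)."""
    return noise_uv_rms / conversion_gain_uv_per_e

# With the reported ~1000 uV/e- CTIA conversion gain, 500 uV RMS of
# output-referred noise corresponds to only 0.5 e-:
print(read_noise_electrons(500, 1000))  # 0.5
# The same 500 uV at a (hypothetical) 100 uV/e- pixel would be 5 e-:
print(read_noise_electrons(500, 100))   # 5.0
```

This also shows why an error in the measured conversion gain (the CVF uncertainty mentioned above) directly scales the electron noise figure derived from it.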

Wednesday, December 07, 2011

1st Day of CMOS Image Sensors for High Performance Applications Workshop

Albert Theuwissen published a report from the 1st day of the CMOS Detector Workshop being held in Toulouse, France these days. The report was written by M. Sarkar and covers most of the papers presented at the workshop. The workshop is geared toward space and scientific applications. The full list of the workshop papers can be found here.

e2v Sensors Power NASA Curiosity

e2v: On November 26, e2v sensors were launched into space onboard an Atlas V rocket as part of NASA’s Mars Science Laboratory mission, which plans to land a rover named “Curiosity” on the surface of Mars as part of NASA’s Mars Exploration Programme.

The Mars Science Laboratory is a long-term robotic exploration mission to assess if Mars is, or ever has been, an environment that can support life. It will be the biggest, most capable robot ever to land on another planet. e2v imaging sensors equip both the rover’s Chemistry and Mineralogy instrument (CheMin), which was developed by NASA’s Jet Propulsion Laboratory (JPL), and the Chemistry & Camera instrument (ChemCam), which was developed by the Los Alamos National Lab under an agreement with NASA’s JPL. CheMin will identify and measure the minerals on the planet using sophisticated x-ray detection techniques. The ChemCam instrument consists of a laser, which will be used to vaporise rock samples, and a camera which will then use Laser Induced Breakdown (LIB) spectroscopy to analyse the material produced.

CheMin uses the e2v CCD224, a specialised imaging sensor array optimised for the detection of x-rays in a space environment. This high performance imaging sensor is based upon technology originally implemented in the European Space Agency’s XMM-Newton X-Ray observatory, where it has been operating successfully in the EPIC Instrument for the last 10 years. CheMin will expand the use of e2v’s x-ray imaging sensor technology to the Martian surface.

ChemCam uses the e2v CCD42-10 which is part of a standard range of imaging sensors used for various commercial and high performance applications including ground and space borne astronomy, and spectroscopy. The variant used in ChemCam was back-thinned to maximise sensitivity and coated with a custom graded anti-reflection coating to match the spectroscopic requirements of the mission.

Mars Science Laboratory using laser instrument, artist's concept - courtesy of NASA/JPL-Caltech

Tuesday, December 06, 2011

Albert Theuwissen Reports from IEDM 2011

Albert Theuwissen published a very interesting review of IEDM 2011 papers:
  1. Extremely-low noise CMOS image sensor with high saturation capacity, by K. Itonaga (Sony).
  2. High performance 300 mm backside illumination technology for continuous pixel shrinkage, by D. Yaung (TSMC).
    "QE values for a 0.9 um pixel were shown : 50 % in blue, 47 % in green and 45 % in red. The pixels were realized in a 65 nm process with a remaining thickness of the silicon equal to 2 um … 4 um. In the case of the 0.9 um pixel, the optical cross-talk is about 4 times as large as in the 1.1 um version."
  3. A 1.4 um front-side illuminated image sensor with novel light-guiding structure consisting of stacked lightpipes, by H. Watanabe (Panasonic).
    "QE in green 74 % in comparison with 69 % for the BSI and 43 % for the FSI without stacked lightpipe."
  4. Investigation of dark current random telegraph signal in pinned photodiode CMOS image sensors, by V. Goiffon (ISAE).
  5. A CMOS compatible Ge-on-Si APD operation in proportional and Geiger modes at infrared wavelengths, by A. Sammak (TU Delft).
  6. Enhanced angle sensitive pixels for light field imaging, by S. Sivaramakrishnan (Cornell University).
  7. A 192×108 pixel ToF-3D image sensor with single-tap concentric-gate demodulation pixels in 0.13 um technology, by T.-Y. Lee (Samsung).

Update: Moving EF's comments to the front page:

1. The Sony paper was an embarrassment for Sony. The presenter who claimed significant noise reduction and increased saturation charge could not answer the question as to what the actual value of noise and saturation signal was, saying he was not knowledgeable about the details. This is shameful in a conference like IEDM.

2. TSMC said that they had a lot of particulate problems in wafer bonding, leading to many manufacturing issues, including wafer distortion during alignment, "breaking bubbles", etc. They said they have now been able to reduce particulates to a lower level. This speaker was good about answering questions, so a plus for TSMC. He also called the process "TSMC BSI" and not by his co-authors' company Omnivision. Interesting.

3. A very nicely presented paper, although the 74% QE number is somewhat hard to accept. If true, it is remarkable. Reminds me that Aptina has also said lightpipes with FSI make BSI less necessary. Watanabe says at 1.1 um, FSI might be comparable to BSI using the lightpipe technology. He is not sure about 0.9 um. Personally, this is on my own short list for WKA nominations.

4. Interesting investigation of blinking pixels. Location of the traps were not determined but there was a lot of interesting statistical data presented.

5. Very interesting results. Too bad we cannot really work on this in the open in the US.

6. Enthusiastically presented student paper.

7. Well, I found this paper interesting, inasmuch as it is my device but with a life of its own at Samsung. When AT says "pretty good" performance, I think he means "pretty much SOA" performance. Note also that normal two-tap operation throws away 50% of the light (because you also need the quadrature signals), so losing 75% in single-tap is not as bad as it sounds. It is only a ~30% loss in SNR compared to other techniques, but the reduction in FPN makes up for the SNR loss when it comes to determining depth accuracy.
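The ~30% figure in comment 7 follows from shot-noise scaling (a sketch under the assumption of shot-noise-limited operation, where SNR grows as the square root of the collected signal):

```python
import math

def relative_snr_loss(signal_fraction, reference_fraction):
    """Fractional SNR loss when the collected signal drops,
    assuming shot-noise-limited operation (SNR ~ sqrt(signal))."""
    return 1.0 - math.sqrt(signal_fraction / reference_fraction)

# Two-tap ToF readout keeps ~50% of the light; single-tap keeps ~25%:
loss = relative_snr_loss(0.25, 0.50)
print(f"{loss:.0%}")  # 29%
```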

Pixelplus Reports Q2, Q3 Results, Withdraws Pink Sheets

PR Newswire: Based on unaudited results of operations in accordance with Korean GAAP on a non-consolidated basis, Pixelplus revenues for Q2 and Q3 2011 were US$10.5M and US$10.4M respectively, compared to US$5.8M and US$5.9M in Q2 and Q3 a year ago. Net incomes in Q2 and Q3 2011 were US$2.8M and US$2.4M, compared to net incomes of US$0.6M and US$1.4M a year ago. Gross margins for Q2 and Q3 2011 were 40.1% and 39.4%, compared to 36.8% in Q1 2011.

"We continue to design and introduce cutting-edge products and technologies and release to the market other innovative technologies," said S.K. Lee, CEO and Founder of Pixelplus. "For this purpose, we continue to develop our core strategic business for automobile, security and surveillance applications, and positively collaborate with medical endoscope manufacturers in South Korea as well as key distributors and manufacturers in China, Hong Kong, Taiwan, and Japan. In parallel, we continue to vigorously pursue cost-control measures and are encouraged that we continue to effectively manage our operating expenses on a reliable and consistent basis."

PR Newswire: Pixelplus announces that it will terminate its American Depositary Receipts (ADR) Program. The company's ADRs will continue to trade over the counter (OTC) in the US until the date of termination of the Deposit Agreement on February 29, 2012.

"We are inclined to terminate the Deposit Agreement and ADR Program as we do not envision the ADR Program as positively enabling, effectuating, or contributing to our short-term and long-term business goals and strategies now and into the foreseeable future," said S.K. Lee. "In addition, the Company's financial costs and expenses incurred in connection with sustaining the ADR Program poses an undue and unnecessary economic burden which we would like to eliminate in moving forward. For these reasons, we have no choice but to terminate the Deposit Agreement and ADR Program in a timely and effective manner."

Pixelplus expects to become a part of the "Free Board" in South Korea in due course, which is South Korea's equivalent of the OTC in the US.

Aptina Surveillance Sensor Wins EDN China Award

Techworks Asia: Aptina has won a Best Product Award in EDN China’s Innovation Awards 2011, for its AR0331 surveillance image sensor, in the Passive Component and Sensor category.

The 3.1MP 1/3-inch AR0331 sensor targets the 1080p/60fps HD video surveillance market and is based on a 2.2um pixel. The sensor combines full HD video with WDR capability and built-in adaptive local tone mapping. Binning techniques enable the sensor's sub-1-lux low-light performance.
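Binning, mentioned above as an enabler of low-light performance, combines neighboring pixels to trade resolution for sensitivity. A generic 2x2 binning sketch on a monochrome frame (illustrative only; a Bayer sensor like the AR0331 combines same-color sites, and Aptina's exact scheme is not disclosed here):

```python
import numpy as np

def bin2x2(frame):
    """Sum each 2x2 pixel block: 4x the signal per output pixel,
    at the cost of halving resolution in each dimension."""
    h, w = frame.shape
    f = frame[: h // 2 * 2, : w // 2 * 2]  # crop to even dimensions
    return f.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

frame = np.full((1080, 1920), 10, dtype=np.uint16)  # dummy flat frame
binned = bin2x2(frame)
print(binned.shape)   # (540, 960)
print(binned[0, 0])   # 40
```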

Applied Materials Targets BSI Sensors Manufacturing

Applied Materials announces the Applied Producer Optiva CVD system aimed at the manufacture of BSI sensors. "Emerging BSI image sensor designs present a new opportunity for Applied Materials to provide customers with the technology they need to be successful in this rapidly growing market," said Bill McClintock, VP and GM of Applied’s Dielectric Systems and Modules business unit. "The Optiva low temperature process runs on our lightning-fast Producer platform, which is great news for chipmakers looking to satisfy the demand for an estimated 300 million BSI image sensors expected to be needed by 2014."

Producer Optiva system is capable of depositing low temperature, conformal films that boost the low-light performance of the sensor while improving its durability. The system enhances the performance of the microlens by covering it with a tough, thin, transparent film layer that reduces reflections and scratches, and protects it from the environment. Importantly, the Optiva tool is the first CVD system to enable >95% conformal deposition at temperatures less than 200°C. As typical bonding adhesives have thermal budgets of approximately 200ºC, all subsequent processing on these temporarily bonded wafers must be done below 200ºC.

According to iSuppli, three-quarters of all smartphones will be fitted with BSI sensors in 2014, up from just 14% in 2010.

Monday, December 05, 2011

3D ToF Forum

MESA Imaging has opened a 3D ToF Forum. It is mainly related to MESA products but also covers general issues of ToF in its nice FAQ and Applications sections. There are quite a few postings there and it's open for general discussion.

Thanks to ML for sending me the link!

Sunday, December 04, 2011

Varioptic Electrowetting Demo

Varioptic published a nice Youtube video of electrowetting effect:



Another video shows Varioptic continuous AF in action.

Saturday, December 03, 2011

NIT Announces 1.3MP WDR Sensor with Extended IR Response

New Imaging Technology announces the NSC1105, a 1.3MP WDR sensor with a 10.6µm pixel and a DR of more than 140 dB in a single frame time. Thanks to its Native Wide Dynamic Range technology, the NSC1105 does not require any settings or exposure time control and always provides a useful image whatever the illumination conditions are.
The NSC1105 also benefits from an extended spectral response in the IR range and good low light sensitivity:

Friday, December 02, 2011

Samsung to Mass Produce "Optical Sensor in Pixel" LCD Panel

Business Wire: Samsung announces that it began mass production of 40-inch "Optical Sensor in Pixel" LCD panels in November this year.

The Optical Sensor in Pixel LCD panel detects reflected images of an object on the panel using infrared sensors that are built into the panel. With an optical sensor in each pixel of the panel, the new panel can sense touch much more accurately than existing touch panels. The panel can detect more than 50 touch points simultaneously and can display images with Full HD resolution and wide-angle viewing.

All of the input functions of a keyboard, mouse or scanner can be carried out on the panel itself. The panel can be installed in a variety of applications including table top and wall-mounted types. Its tempered glass is strong enough to withstand external loads over 80 kilograms.

As the panel can perform touch and multi-touch sensing and image display simultaneously, it represents a new paradigm for massively interactive communications, compared to the one-way communication of today’s kiosk touch panels.

The Optical Sensor in Pixel LCD panel has been installed in "Samsung SUR40 for Microsoft Surface", a table-type PC product, co-developed by Samsung and Microsoft. SUR40 has been available for pre-order since last month.

"Our Optical Sensor in Pixel panel has overcome the limitations of touch functionality that have hampered the effectiveness of most interactive displays," said Younghwan Park, SVP sales and marketing team, Samsung Electronics LCD Business. "With the world’s first mass production of an Optical Sensor in Pixel LCD, Samsung Electronics has set its sights on taking the lead in the global interactive display market," he added.

6Sight Panels On Mobile Imaging

The Sept. 2011 issue of the 6Sight Report published a transcript of a discussion on mobile imaging trends from its June 2011 Summit. A few quotes:

Tony Henning, 6Sight senior analyst:

"In digital still cameras, some are actually backing off the resolution to deliver better results. It’s said some sanity is coming to the megapixel race. But is there sanity coming to the mobile megapixel race? Will that race end soon?"

Robbert Emery, OmniVision:

"That’s a very interesting question. I think about it a different way: What can I do with more data? With more pixels? We look at digital zoom; could be better. We look at image stabilization; could be better. If we look at what happened in the economic downturn, of course, there’s been some slowing in the adoption of higher resolution. But coming out of the economic downturn, the numbers shipped for lower resolution camera phones are lowering, and those for higher resolution phones are increasing. We’re looking at the adoption of 5-, 8-, 10-megapixel and above.

With applications and other ways of using more data, the race is definitely still on for higher resolution."

Paul Gallagher, Samsung:

"Right now it’s difficult to say. I agree we saw a slowdown during the economic downturn, but coming out of it, we’re seeing very aggressive adoption of 8 megapixel, and aggressive interest in 12MP. I think what we need to start looking at is we are starting to hit some cost barriers. Historically, the image sensor shrunk the pixel as a means to reduce its cost or to increase the pixel count. But now with the adoption of BSI, you saw a reset of market price. And when you move into sub-micron pixels, you’re probably going to see another reset. The consequences, from the economic point of view, will cause a slowdown. When you start looking at 16-megapixel third-inch optical-format products, the cost may break the model enough that the OEMs start rethinking when enough is enough."

Lars Nord, Sony Ericsson:

"I think it’s about what specifications people use to select the phone. Right now we don’t have many, so resolution is the one they use. If you compare phones: this has five, this has eight… they’ll take the eight because you get more. More is better? Maybe not every time, but that’s how people think. Until we get some other measures of quality, we will see this megapixel race go on, unfortunately."

Other topics covered at the panel discussion are sensors and optics improvements, recent innovations and what's next in mobile imaging.

Thursday, December 01, 2011

New Version of Panasonic ToF D-Imager Works Outdoors

Youtube has a demo video of a new version of the Panasonic 3D ToF D-Imager, the EKL3106, capable of working outdoors at up to 100,000 lux illumination. The previous D-Imager version, the EKL3104, was specced up to 20,000 lux.

Sharp Fits OIS, AF, F2.5 Optics and 12MP BSI Sensor into 5.47mm-High Module

Sharp announces a 12.1MP, 1/3.2-inch CMOS camera module with optical image stabilization and autofocus that features the industry’s thinnest profile - 5.47 mm in height. The new RJ63YC100 is intended for use in mobile devices such as smartphones. Sample shipments will begin on December 2, 2011, with volume production from January 10, 2012.

The 11.0 (W) x 11.0 (L) x 5.47 (H) mm module has an F2.5 lens with a horizontal viewing angle of 61 deg. The module supports 1080p video through a 4-lane MIPI interface.

Yole Forecasts WLCSP Sensors Growth

Yole Developpement's report on WLCSP devices forecasts: "All in all, ‘fan-in’ WLCSP shows the first early signs of a maturing market, with price pressure and process standardization, but it still grows faster than the average semiconductor packaging market due to the fast growth rates of smartphones and tablet PCs, in which WLCSP considerably helps save space and reduce costs."