Enum libcamera::controls::ControlId

#[repr(u32)]
pub enum ControlId {
    AeEnable = 1,
    AeLocked = 2,
    AeMeteringMode = 3,
    AeConstraintMode = 4,
    AeExposureMode = 5,
    ExposureValue = 6,
    ExposureTime = 7,
    AnalogueGain = 8,
    Brightness = 9,
    Contrast = 10,
    Lux = 11,
    AwbEnable = 12,
    AwbMode = 13,
    AwbLocked = 14,
    ColourGains = 15,
    ColourTemperature = 16,
    Saturation = 17,
    SensorBlackLevels = 18,
    Sharpness = 19,
    FocusFoM = 20,
    ColourCorrectionMatrix = 21,
    ScalerCrop = 22,
    DigitalGain = 23,
    FrameDuration = 24,
    FrameDurationLimits = 25,
    SensorTemperature = 26,
    SensorTimestamp = 27,
    AfMode = 28,
    AfRange = 29,
    AfSpeed = 30,
    AfMetering = 31,
    AfWindows = 32,
    AfTrigger = 33,
    AfPause = 34,
    LensPosition = 35,
    AfState = 36,
    AfPauseState = 37,
    AePrecaptureTrigger = 38,
    NoiseReductionMode = 39,
    ColorCorrectionAberrationMode = 40,
    AeState = 41,
    AwbState = 42,
    SensorRollingShutterSkew = 43,
    LensShadingMapMode = 44,
    SceneFlicker = 45,
    PipelineDepth = 46,
    MaxLatency = 47,
    TestPatternMode = 48,
}

Variants

AeEnable = 1

Enable or disable the AE.

\sa ExposureTime AnalogueGain

AeLocked = 2

Report the lock status of a running AE algorithm.

If the AE algorithm is locked the value shall be set to true; if it’s converging it shall be set to false. If the AE algorithm is not running the control shall not be present in the metadata control list.

\sa AeEnable

AeMeteringMode = 3

Specify a metering mode for the AE algorithm to use. The metering modes determine which parts of the image are used to determine the scene brightness. Metering modes may be platform specific and not all metering modes may be supported.

AeConstraintMode = 4

Specify a constraint mode for the AE algorithm to use. These determine how the measured scene brightness is adjusted to reach the desired target exposure. Constraint modes may be platform specific, and not all constraint modes may be supported.

AeExposureMode = 5

Specify an exposure mode for the AE algorithm to use. These specify how the desired total exposure is divided between the shutter time and the sensor’s analogue gain. The exposure modes are platform specific, and not all exposure modes may be supported.

ExposureValue = 6

Specify an Exposure Value (EV) parameter. The EV parameter will only be applied if the AE algorithm is currently enabled.

By convention EV adjusts the exposure as log2. For example, EV = [-2, -1, -0.5, 0, 0.5, 1, 2] results in an exposure adjustment of [1/4x, 1/2x, 1/sqrt(2)x, 1x, sqrt(2)x, 2x, 4x].

\sa AeEnable
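A minimal sketch of the log2 convention above, in plain Rust with no libcamera types (the helper name is illustrative, not a crate API):

```rust
/// Exposure multiplier implied by an EV adjustment: 2^EV.
/// A sketch of the log2 convention, not a libcamera API.
fn ev_to_multiplier(ev: f64) -> f64 {
    2f64.powf(ev)
}

fn main() {
    // Reproduces the example table: -2 -> 1/4x, 0.5 -> sqrt(2)x, 2 -> 4x.
    for ev in [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0] {
        println!("EV {:+}: {:.3}x", ev, ev_to_multiplier(ev));
    }
}
```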

ExposureTime = 7

Exposure time (shutter speed) for the frame applied in the sensor device. This value is specified in micro-seconds.

Setting this value means that it is now fixed and the AE algorithm may not change it. Setting it back to zero returns it to the control of the AE algorithm.

\sa AnalogueGain AeEnable

\todo Document the interactions between AeEnable and setting a fixed value for this control. Consider interactions with other AE features, such as aperture and aperture/shutter priority mode, and decide if control of which features should be automatically adjusted shouldn’t better be handled through a separate AE mode control.

AnalogueGain = 8

Analogue gain value applied in the sensor device. The value of the control specifies the gain multiplier applied to all colour channels. This value cannot be lower than 1.0.

Setting this value means that it is now fixed and the AE algorithm may not change it. Setting it back to zero returns it to the control of the AE algorithm.

\sa ExposureTime AeEnable

\todo Document the interactions between AeEnable and setting a fixed value for this control. Consider interactions with other AE features, such as aperture and aperture/shutter priority mode, and decide if control of which features should be automatically adjusted shouldn’t better be handled through a separate AE mode control.

Brightness = 9

Specify a fixed brightness parameter. Positive values (up to 1.0) produce brighter images; negative values (up to -1.0) produce darker images and 0.0 leaves pixels unchanged.

Contrast = 10

Specify a fixed contrast parameter. Normal contrast is given by the value 1.0; larger values produce images with more contrast.

Lux = 11

Report an estimate of the current illuminance level in lux. The Lux control can only be returned in metadata.

AwbEnable = 12

Enable or disable the AWB.

\sa ColourGains

AwbMode = 13

Specify the range of illuminants to use for the AWB algorithm. The modes supported are platform specific, and not all modes may be supported.

AwbLocked = 14

Report the lock status of a running AWB algorithm.

If the AWB algorithm is locked the value shall be set to true; if it’s converging it shall be set to false. If the AWB algorithm is not running the control shall not be present in the metadata control list.

\sa AwbEnable

ColourGains = 15

Pair of gain values for the Red and Blue colour channels, in that order. ColourGains can only be applied in a Request when the AWB is disabled.

\sa AwbEnable

ColourTemperature = 16

Report the current estimate of the colour temperature, in kelvin, for this frame. The ColourTemperature control can only be returned in metadata.

Saturation = 17

Specify a fixed saturation parameter. Normal saturation is given by the value 1.0; larger values produce more saturated colours; 0.0 produces a greyscale image.

SensorBlackLevels = 18

Reports the sensor black levels used for processing a frame, in the order R, Gr, Gb, B. These values are returned as numbers out of a 16-bit pixel range (as if pixels ranged from 0 to 65535). The SensorBlackLevels control can only be returned in metadata.
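To illustrate the 16-bit normalisation, the sketch below left-shifts a black level from the sensor's native bit depth into the 0–65535 range; the function name is made up here and the exact scaling used by a real pipeline handler may differ:

```rust
/// Scale a black level from the sensor's native bit depth into the
/// 16-bit range that SensorBlackLevels reports values in.
/// A simple left-shift sketch; not a libcamera API.
fn black_level_16bit(native: u16, native_bits: u32) -> u16 {
    native << (16 - native_bits)
}

fn main() {
    // A typical 10-bit black level of 64 corresponds to 4096 in 16-bit terms.
    println!("{}", black_level_16bit(64, 10));
}
```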

Sharpness = 19

A value of 0.0 means no sharpening. The minimum value means minimal sharpening, and shall be 0.0 unless the camera can’t disable sharpening completely. The default value shall give a “reasonable” level of sharpening, suitable for most use cases. The maximum value may apply extremely high levels of sharpening, higher than anyone could reasonably want. Negative values are not allowed. Note also that sharpening is not applied to raw streams.

FocusFoM = 20

Reports a Figure of Merit (FoM) to indicate how in-focus the frame is. A larger FocusFoM value indicates a more in-focus frame. This singular value may be based on a combination of statistics gathered from multiple focus regions within an image. The number of focus regions and method of combination is platform dependent. In this respect, it is not necessarily aimed at providing a way to implement a focus algorithm by the application, rather an indication of how in-focus a frame is.

ColourCorrectionMatrix = 21

The 3x3 matrix that converts camera RGB to sRGB within the imaging pipeline. This should describe the matrix that is used after pixels have been white-balanced, but before any gamma transformation. The 3x3 matrix is stored in conventional reading order in an array of 9 floating point values.
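The row-major, 9-element storage can be made concrete with a small sketch that applies such a matrix to a white-balanced RGB triple (illustrative only; the pipeline performs this step internally):

```rust
/// Apply a 3x3 matrix, stored row-major in a 9-element array (the
/// layout ColourCorrectionMatrix uses), to a white-balanced RGB triple.
fn apply_ccm(m: &[f32; 9], rgb: [f32; 3]) -> [f32; 3] {
    [
        m[0] * rgb[0] + m[1] * rgb[1] + m[2] * rgb[2],
        m[3] * rgb[0] + m[4] * rgb[1] + m[5] * rgb[2],
        m[6] * rgb[0] + m[7] * rgb[1] + m[8] * rgb[2],
    ]
}

fn main() {
    // The identity matrix leaves camera RGB unchanged.
    let identity = [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0];
    println!("{:?}", apply_ccm(&identity, [0.2, 0.5, 0.8]));
}
```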

ScalerCrop = 22

Sets the image portion that will be scaled to form the whole of the final output image. The (x,y) location of this rectangle is relative to the PixelArrayActiveAreas that is being used. The units remain native sensor pixels, even if the sensor is being used in a binning or skipping mode.

This control is only present when the pipeline supports scaling. Its maximum valid value is given by the properties::ScalerCropMaximum property, and the two can be used to implement digital zoom.

DigitalGain = 23

Digital gain value applied during the processing steps applied to the image as captured from the sensor.

The global digital gain factor is applied to all the colour channels of the RAW image. Different pipeline models are free to specify how the global gain factor applies to each separate channel.

If an imaging pipeline applies digital gain in distinct processing steps, this value indicates their total sum. Pipelines are free to decide how to adjust each processing step to respect the received gain factor and shall report their total value in the request metadata.

FrameDuration = 24

The instantaneous frame duration from start of frame exposure to start of next exposure, expressed in microseconds. This control is meant to be returned in metadata.

FrameDurationLimits = 25

The minimum and maximum (in that order) frame duration, expressed in microseconds.

When provided by applications, the control specifies the sensor frame duration interval the pipeline has to use. This limits the largest exposure time the sensor can use. For example, if a maximum frame duration of 33ms is requested (corresponding to 30 frames per second), the sensor will not be able to raise the exposure time above 33ms. A fixed frame duration is achieved by setting the minimum and maximum values to be the same. Setting both values to 0 reverts to using the camera defaults.

The maximum frame duration provides the absolute limit to the shutter speed computed by the AE algorithm and it overrides any exposure mode setting specified with controls::AeExposureMode. Similarly, when a manual exposure time is set through controls::ExposureTime, it also gets clipped to the limits set by this control. When reported in metadata, the control expresses the minimum and maximum frame durations used after being clipped to the sensor provided frame duration limits.

\sa AeExposureMode \sa ExposureTime

\todo Define how to calculate the capture frame rate by defining controls to report additional delays introduced by the capture pipeline or post-processing stages (i.e. JPEG conversion, frame scaling).

\todo Provide an explicit definition of default control values, for this and all other controls.
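The frame-rate arithmetic in the example above (33 ms for 30 fps) can be sketched as follows; the helper name is illustrative, not a crate API:

```rust
/// Frame duration in microseconds for a target frame rate, in the
/// units FrameDurationLimits uses. Illustrative helper, not a crate API.
fn frame_duration_us(fps: f64) -> i64 {
    (1_000_000.0 / fps).round() as i64
}

fn main() {
    // 30 fps corresponds to a ~33 ms maximum frame duration.
    // Setting both limits to this value would fix the frame rate at 30 fps.
    println!("{}", frame_duration_us(30.0));
}
```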

SensorTemperature = 26

Temperature measure from the camera sensor in Celsius. This is typically obtained by a thermal sensor present on-die or in the camera module. The range of reported temperatures is device dependent.

The SensorTemperature control will only be returned in metadata if a thermal sensor is present.

SensorTimestamp = 27

The time when the first row of the image sensor active array is exposed.

The timestamp, expressed in nanoseconds, represents a monotonically increasing counter since the system boot time, as defined by the Linux-specific CLOCK_BOOTTIME clock id.

The SensorTimestamp control can only be returned in metadata.

\todo Define how the sensor timestamp has to be used in the reprocessing use case.

AfMode = 28

Control to set the mode of the AF (autofocus) algorithm.

An implementation may choose not to implement all the modes.

AfRange = 29

Control to set the range of focus distances that is scanned. An implementation may choose not to implement all the options here.

AfSpeed = 30

Control that determines whether the AF algorithm is to move the lens as quickly as possible or more steadily. For example, during video recording it may be desirable not to move the lens too abruptly, but when in a preview mode (waiting for a still capture) it may be helpful to move the lens as quickly as is reasonably possible.

AfMetering = 31

Instruct the AF algorithm how it should decide which parts of the image should be used to measure focus.

AfWindows = 32

Sets the focus windows used by the AF algorithm when AfMetering is set to AfMeteringWindows. The units used are pixels within the rectangle returned by the ScalerCropMaximum property.

In order to be activated, a rectangle must be programmed with non-zero width and height. Internally, these rectangles are intersected with the ScalerCropMaximum rectangle. If the window becomes empty after this operation, then the window is ignored. If all the windows end up being ignored, then the behaviour is platform dependent.

On platforms that support the ScalerCrop control (for implementing digital zoom, for example), no automatic recalculation or adjustment of AF windows is performed internally if the ScalerCrop is changed. If any window lies outside the output image after the scaler crop has been applied, it is up to the application to recalculate them.

The details of how the windows are used are platform dependent. We note that when there is more than one AF window, a typical implementation might find the optimal focus position for each one and finally select the window where the focal distance for the objects shown in that part of the image are closest to the camera.

AfTrigger = 33

This control starts an autofocus scan when AfMode is set to AfModeAuto, and can also be used to terminate a scan early.

It is ignored if AfMode is set to AfModeManual or AfModeContinuous.

AfPause = 34

This control has no effect except when in continuous autofocus mode (AfModeContinuous). It can be used to pause any lens movements while (for example) images are captured. The algorithm remains inactive until it is instructed to resume.

LensPosition = 35

Acts as a control to instruct the lens to move to a particular position and also reports back the position of the lens for each frame.

The LensPosition control is ignored unless the AfMode is set to AfModeManual, though the value is reported back unconditionally in all modes.

This value, which is generally a non-integer, is the reciprocal of the focal distance in metres, also known as dioptres. That is, to set a focal distance D, the lens position LP is given by

LP = 1 m / D

For example:

0 moves the lens to infinity. 0.5 moves the lens to focus on objects 2 m away. 2 moves the lens to focus on objects 50 cm away. Larger values focus the lens closer still.

The default value of the control should indicate a good general position for the lens, often corresponding to the hyperfocal distance (the closest position for which objects at infinity are still acceptably sharp). The minimum will often be zero (meaning infinity), and the maximum value defines the closest focus position.

\todo Define a property to report the Hyperfocal distance of calibrated lenses.
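The reciprocal relationship above is easy to check numerically; this helper is a sketch of the convention, not a crate API:

```rust
/// LensPosition value (dioptres) for a focal distance in metres:
/// LP = 1 / D. A sketch of the reciprocal convention, not a crate API.
fn lens_position(distance_m: f64) -> f64 {
    1.0 / distance_m
}

fn main() {
    println!("2 m      -> {} dioptres", lens_position(2.0)); // 0.5
    println!("0.5 m    -> {} dioptres", lens_position(0.5)); // 2
    println!("infinity -> {} dioptres", lens_position(f64::INFINITY)); // 0
}
```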

AfState = 36

Reports the current state of the AF algorithm in conjunction with the reported AfMode value and (in continuous AF mode) the AfPauseState value. The possible state changes are described below, though we note the following state transitions that occur when the AfMode is changed.

If the AfMode is set to AfModeManual, then the AfState will always report AfStateIdle (even if the lens is subsequently moved). Changing to the AfModeManual state does not initiate any lens movement.

If the AfMode is set to AfModeAuto then the AfState will report AfStateIdle. However, if AfModeAuto and AfTriggerStart are sent together then AfState will omit AfStateIdle and move straight to AfStateScanning (and start a scan).

If the AfMode is set to AfModeContinuous then the AfState will initially report AfStateScanning.

AfPauseState = 37

Only applicable in continuous (AfModeContinuous) mode, this reports whether the algorithm is currently running, paused or pausing (that is, will pause as soon as any in-progress scan completes).

Any change to AfMode will cause AfPauseStateRunning to be reported.

AePrecaptureTrigger = 38

Control for AE metering trigger. Currently identical to ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER.

Whether the camera device will trigger a precapture metering sequence when it processes this request.

NoiseReductionMode = 39

Control to select the noise reduction algorithm mode. Currently identical to ANDROID_NOISE_REDUCTION_MODE.

Mode of operation for the noise reduction algorithm.

ColorCorrectionAberrationMode = 40

Control to select the color correction aberration mode. Currently identical to ANDROID_COLOR_CORRECTION_ABERRATION_MODE.

Mode of operation for the chromatic aberration correction algorithm.

AeState = 41

Control to report the current AE algorithm state. Currently identical to ANDROID_CONTROL_AE_STATE.

Current state of the AE algorithm.

AwbState = 42

Control to report the current AWB algorithm state. Currently identical to ANDROID_CONTROL_AWB_STATE.

Current state of the AWB algorithm.

SensorRollingShutterSkew = 43

Control to report the time between the start of exposure of the first row and the start of exposure of the last row. Currently identical to ANDROID_SENSOR_ROLLING_SHUTTER_SKEW

LensShadingMapMode = 44

Control to report if the lens shading map is available. Currently identical to ANDROID_STATISTICS_LENS_SHADING_MAP_MODE.

SceneFlicker = 45

Control to report the detected scene light frequency. Currently identical to ANDROID_STATISTICS_SCENE_FLICKER.

PipelineDepth = 46

Specifies the number of pipeline stages the frame went through from when it was exposed to when the final completed result was available to the framework. Always less than or equal to PipelineMaxDepth. Currently identical to ANDROID_REQUEST_PIPELINE_DEPTH.

The typical value for this control is 3, as a frame is first exposed, captured and then processed in a single pass through the ISP. Any additional processing step performed after the ISP pass (for example, face detection or additional format conversions) counts as an additional pipeline stage.

MaxLatency = 47

The maximum number of frames that can occur after a request (different from the previous one) has been submitted, and before the result’s state becomes synchronized. A value of -1 indicates unknown latency, and 0 indicates per-frame control. Currently identical to ANDROID_SYNC_MAX_LATENCY.

TestPatternMode = 48

Control to select the test pattern mode. Currently identical to ANDROID_SENSOR_TEST_PATTERN_MODE.

Trait Implementations

impl Clone for ControlId

fn clone(&self) -> ControlId

Returns a copy of the value.

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.

impl Debug for ControlId

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter.

impl From<ControlId> for u32

fn from(enum_value: ControlId) -> Self

Converts to this type from the input type.

impl PartialEq for ControlId

fn eq(&self, other: &ControlId) -> bool

Tests for self and other values to be equal, and is used by ==.

fn ne(&self, other: &Rhs) -> bool

Tests for !=. The default implementation is almost always sufficient, and should not be overridden without very good reason.

impl TryFrom<u32> for ControlId

type Error = TryFromPrimitiveError<ControlId>

The type returned in the event of a conversion error.

fn try_from(number: u32) -> Result<Self, TryFromPrimitiveError<Self>>

Performs the conversion.
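The u32-to-enum round trip these impls provide can be sketched standalone. The real impls are derived (via TryFromPrimitive); DemoControlId below is a stand-in with just two variants so the example compiles on its own, and its Error type is a plain u32 rather than the crate's TryFromPrimitiveError:

```rust
// Standalone sketch of the u32 <-> enum round trip that the
// TryFrom<u32> and From<ControlId> for u32 impls provide.
#[repr(u32)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum DemoControlId {
    AeEnable = 1,
    AeLocked = 2,
}

impl TryFrom<u32> for DemoControlId {
    type Error = u32; // the crate uses TryFromPrimitiveError instead

    fn try_from(n: u32) -> Result<Self, Self::Error> {
        match n {
            1 => Ok(Self::AeEnable),
            2 => Ok(Self::AeLocked),
            other => Err(other), // unknown numeric control id
        }
    }
}

fn main() {
    assert_eq!(DemoControlId::try_from(1), Ok(DemoControlId::AeEnable));
    assert_eq!(DemoControlId::AeEnable as u32, 1); // From<...> for u32 direction
    assert!(DemoControlId::try_from(999).is_err());
    println!("round trip ok");
}
```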
impl TryFromPrimitive for ControlId

type Primitive = u32

const NAME: &'static str = "ControlId"

fn try_from_primitive(number: Self::Primitive) -> Result<Self, TryFromPrimitiveError<Self>>

impl Copy for ControlId

impl Eq for ControlId

impl StructuralPartialEq for ControlId

Auto Trait Implementations

Blanket Implementations

impl<T> Any for T
where T: 'static + ?Sized,

fn type_id(&self) -> TypeId

Gets the TypeId of self.

impl<T> Borrow<T> for T
where T: ?Sized,

fn borrow(&self) -> &T

Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T
where T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.

impl<T> CloneToUninit for T
where T: Clone,

unsafe fn clone_to_uninit(&self, dst: *mut T)

This is a nightly-only experimental API (clone_to_uninit). Performs copy-assignment from self to dst.

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T, U> Into<U> for T
where U: From<T>,

fn into(self) -> U

Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T> ToOwned for T
where T: Clone,

type Owned = T

The resulting type after obtaining ownership.

fn to_owned(&self) -> T

Creates owned data from borrowed data, usually by cloning.

fn clone_into(&self, target: &mut T)

Uses borrowed data to replace owned data, usually by cloning.

impl<T, U> TryFrom<U> for T
where U: Into<T>,

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.