Camera and audio injection is only available for apps that QA Wolf resigns during installation. It is not available for system apps or Safari.
Examples
Inject an image into the camera feed:

```js
// Inject Camera Image
const ios = await import("@qawolf/run-globals-ios");
const bundleId = process.env.BUNDLE_ID; // Bundle ID of the app under test
const storagePath = process.env.STORAGE_PATH; // Path to QA Wolf remote storage
const imagePath = `${storagePath}/large.jpg`;
const cleanup = await ios.injectCamera(driver, bundleId, {
  data: imagePath,
  type: "image", // optional — inferred from .jpg/.png/.mp4/etc.
});
// ... run your assertions while the camera feed is mocked ...
await cleanup();
```
Once the config is pushed, the camera preview updates to the injected image and loops continuously.
Inject a video into the camera feed:

```js
// Inject Camera Video
const ios = await import("@qawolf/run-globals-ios");
const bundleId = process.env.BUNDLE_ID; // Bundle ID of the app under test
const storagePath = process.env.STORAGE_PATH; // Path to QA Wolf remote storage
const videoPath = `${storagePath}/wolf.mp4`;
const cleanup = await ios.injectCamera(driver, bundleId, {
  data: videoPath,
  type: "video", // optional — inferred from .mp4/.mov/.m4v/.avi
});
// ... run your assertions while the camera feed plays the video ...
await cleanup();
```
Inject audio into the microphone:

```js
// Inject Audio
const ios = await import("@qawolf/run-globals-ios");
const bundleId = process.env.BUNDLE_ID; // Bundle ID of the app under test
const storagePath = process.env.STORAGE_PATH; // Path to QA Wolf remote storage
const audioPath = `${storagePath}/sample_audio_input.wav`;
const cleanup = await ios.injectAudio(driver, bundleId, {
  data: audioPath,
});
// Trigger microphone input in your test, then
// ... run your assertions while audio is being injected ...
await cleanup();
```
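Because `injectCamera` and `injectAudio` return an async cleanup function, wrapping your assertions in `try/finally` guarantees the real feed is restored even when an assertion throws. A minimal sketch of that pattern with a hypothetical stub injector (`fakeInject` and `runWithInjection` are illustrations, not part of the library):

```js
// Tracks whether the (stubbed) feed is currently mocked
const state = { injected: false };

// Hypothetical stand-in for ios.injectCamera(driver, bundleId, opts):
// mocks the feed and returns an async cleanup function, like the real API
async function fakeInject() {
  state.injected = true; // feed is now mocked
  return async () => {
    state.injected = false; // real feed restored
  };
}

// Run assertions while the feed is mocked; restore it even if they throw
async function runWithInjection(assertions) {
  const cleanup = await fakeInject();
  try {
    await assertions();
  } finally {
    await cleanup();
  }
}
```

With this wrapper, a failing assertion still leaves the device with its real camera and microphone, so later tests in the same session are unaffected.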
When to use
- Your app scans QR codes, documents, or other visual inputs and you want to test that flow without a physical camera setup
- Your app records video and you need to verify the recording pipeline with deterministic content
- Your app records audio or processes microphone input and you need to test that flow with known audio data
- You need to run the same scenario repeatedly with consistent inputs
Supported file types
- Images: any format supported by UIImage — JPG, PNG, HEIC, GIF, BMP, TIFF, WebP
- Video: any format supported by AVAsset — MP4, MOV, M4V with H.264/HEVC codecs
- Audio: WAV
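When you omit `type`, it is inferred from the file extension. A rough sketch of what that inference could look like (this is an illustration based on the extensions listed above, not the library's actual implementation):

```js
// Extensions taken from the supported-file-types lists above
const IMAGE_EXTS = new Set(["jpg", "jpeg", "png", "heic", "gif", "bmp", "tiff", "webp"]);
const VIDEO_EXTS = new Set(["mp4", "mov", "m4v", "avi"]);

// Infer "image" or "video" from a file path's extension;
// throw when the extension is unrecognized
function inferMediaType(path) {
  const ext = (path.split(".").pop() || "").toLowerCase();
  if (IMAGE_EXTS.has(ext)) return "image";
  if (VIDEO_EXTS.has(ext)) return "video";
  throw new Error(`Cannot infer media type from "${path}"; pass an explicit type`);
}
```

Passing an explicit `type` sidesteps inference entirely, which is safest for unusual extensions.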
Full sample test
```js
import { flow } from "@qawolf/flows/ios";
import * as ios from "@qawolf/run-globals-ios";

export default flow(
  "iOS Media - Camera Photo Injection",
  "iOS - iPhone 15 (iOS 26)",
  async ({ wdio, test, ...testContext }) => {
    await test("iOS Media - Camera Photo Injection", async () => {
      //--------------------------------
      // Arrange:
      //--------------------------------
      // Install and launch the Trot app
      const driver = await wdio.startIos({
        "appium:app": process.env.IOS_APP_STAGING,
        "appium:settings[respectSystemAlerts]": true,
        "appium:autoAcceptAlerts": true,
        "appium:iosInstallPause": "5000",
      });

      const imageName = "large.jpg";
      const baseDir = process.env.BASE_IMAGE_DIR;
      const imagePath = `${baseDir}/${imageName}`;

      // Wait for a button by name, then tap it
      const tapButton = async (name) => {
        const button = driver.$(
          `-ios predicate string:name == '${name}' AND type == 'XCUIElementTypeButton'`,
        );
        await button.waitForDisplayed();
        await button.click();
      };

      // Navigate: Media > Video Recording > AVCaptureMovieFileOutput
      await tapButton("Media");
      await tapButton("Video Recording");
      await tapButton("AVCaptureMovieFileOutput");

      // Observe "Start Recording" button
      await driver
        .$(
          `-ios predicate string:name == 'Start Recording' AND type == 'XCUIElementTypeStaticText'`,
        )
        .waitForDisplayed({ timeout: 10000 });

      //--------------------------------
      // Act:
      //--------------------------------
      // Inject the image as the camera feed
      const cleanup = await ios.injectCamera(driver, process.env.APP_ID, {
        data: imagePath,
        type: "image",
      });
      await driver.pause(3000);

      //--------------------------------
      // Assert:
      //--------------------------------
      // Screenshot of the injected image displaying in the live preview
      const previewElement = driver.$(
        `-ios predicate string:name == 'livePreview' AND type == 'XCUIElementTypeOther'`,
      );
      await previewElement.waitForDisplayed({ timeout: 5000 });
      await wdio
        .expect(driver)
        .toHaveScreenshot(previewElement, "video_recording_image_preview", {
          maxMisMatchPercentage: 5,
        });

      // Restore the real camera feed
      await cleanup();
    });
  },
);
```