Conducting forensics of mobile devices to find signs of a potential compromise
77 comments
March 17, 2025
transpute
mschuster91
> Since Apple won't allow iDevice owners to access an unredacted raw disk image for forensics, iOS malware detection tools are hamstrung.
And it's not just Apple.
Android is just as bad, and even worse for the user: while iOS backups consistently back up everything except material in the Secure Enclave (e.g. credit card and eSIM keys), on Android backup support is optional for apps, and many games outright don't do any kind of backup.
fc417fc802
This is true and I resent it. However, at least you have the option of installing a ROM that supports toggling adb root out of the box. That alone solves 99% of the issues I have with Android in practice.
mschuster91
> However, at least you have the option of installing a ROM that supports toggling adb root out of the box.
That's not valid for all devices: all Samsungs need a one-week cooldown (the Knox lock, presumably to thwart people rooting stolen devices to bypass anti-theft), all modern Androids require a full wipe of the device as part of rooting, which makes it useless for forensics, and a shitload of apps flat out refuse to work on rooted devices - forget many games, forget anything with streaming, forget banking apps.
firefax
>iOS backups are consistent in backing up everything sans stuff in the Secure Enclave
Do they now back up TOTP generators? I lost access to an account I'd had since my teens because, when restoring from backup, I had no MFA entries in Google Authenticator. Since I had ported my teenage cell number into Google Voice, when the backup codes I'd generated for the account failed to restore access, I lost both my Gmail and the phone number I'd had for decades, despite taking what seemed like reasonable steps.
(I'd back up my iPhone to my laptop, and back up my laptop to a USB hard drive, one of which lived in my house and another in a secure offsite location.)
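For context on why this happens: a TOTP generator is just a device-local secret run through RFC 6238; unless the backup captures that secret, restoring the backup cannot restore the factor. A minimal stdlib-only sketch of the code generation (the secret shown is the RFC's published test key, not a real one):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, period=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = int((time.time() if t is None else t) // period)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)
```

Since the whole scheme reduces to `HMAC(secret, time_step)`, an authenticator app that doesn't export `secret` (to a backup or to the cloud) leaves nothing to restore.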
RGamma
Nope, not in general - gotta use multiple second factors and/or the second-factor reset key.
mschuster91
Google Authenticator at least does support cloud uploads.
rvnx
The fact that iPhones are hard to dump is actually the main protection against threats when your phone is stolen or taken away from you (from a more or less legitimate-looking organization or person). It's a pretty good thing overall.
transpute
Why must that prevent backup from an Apple Configurator MDM-supervised device that is paired to an admin Macbook, with MDM policy to prevent mobile pairing with any other Macbook? There is a full cryptographic chain to verify the supervising device, which already has full MDM policy control of the mobile device. What security is being added by preventing that authorized supervisor from doing a forensic backup?
matheusmoreira
> provide optional remote attestation to verify OS and baseband integrity
And lock us out of our computing freedom while they're at it.
Remote attestation enables discrimination against free computers owned by users rather than corporations. They could theoretically allow users to set their own keys but it's not like apps and services are gonna trust people's personal attestation keys, they're only gonna trust Apple's and Google's.
This is among the most dangerous developments in cryptography to date and it's gonna end free computing as we know it today. Before this, cryptography used to empower people like us. Now it's the tool that will destroy our freedom and everything the word "hacker" ever stood for. Malware is a small price to pay to avoid such a fate.
It's not going to be "optional" either. Every major service is going to use it. Guaranteed.
transpute
> Remote attestation enables discrimination against free computers owned by users rather than corporations.
Not when my mobile device is attesting to my home server with OSS attestation software, or my USB Armory with OSS firmware for local "remote" attestation. GrapheneOS can attest to a 2nd mobile device running GrapheneOS, or a web verifier. This is not rocket science. Provide a mobile setting for attestation server URL.
> Every major service is going to use it. Guaranteed.
Hence there must be a mandatory option to define your attestation server. Advocating for the right to specify and/or host your arbiter of device trust (including firmware RoT) will do infinitely more for freedom than arguing against cryptography.
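The shape of "attest to a server you choose" is simple: the verifier sends a fresh nonce, the device returns its measurement bound to that nonce, and the verifier checks both. A toy sketch, with loud caveats: every name here is hypothetical, and real attestation (e.g. Android Key Attestation) uses asymmetric keys in a hardware root of trust, not the shared HMAC key used below to keep the example self-contained.

```python
import hashlib
import hmac
import os

# Hypothetical sketch of a user-chosen attestation server. A shared key
# stands in for a hardware-rooted signing key purely for brevity.
DEVICE_KEY = b"pre-shared-attestation-key"
EXPECTED_OS_HASH = hashlib.sha256(b"os-image-v1").hexdigest()  # known-good measurement

def device_quote(nonce, os_hash):
    """Device side: bind the current OS measurement to the verifier's nonce."""
    return hmac.new(DEVICE_KEY, nonce + os_hash.encode(), hashlib.sha256).digest()

def verify(nonce, os_hash, quote):
    """Verifier side (your home server): recompute the quote and compare."""
    expected = hmac.new(DEVICE_KEY, nonce + os_hash.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, quote) and os_hash == EXPECTED_OS_HASH

nonce = os.urandom(32)  # fresh nonce defeats replay of an old quote
ok = verify(nonce, EXPECTED_OS_HASH, device_quote(nonce, EXPECTED_OS_HASH))
```

Nothing in the protocol itself requires the verifier to be Google or Apple; the dispute in this thread is about who services will accept as the verifier, not whether a self-hosted one can work.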
LoganDark
> Not when my mobile device is attesting to my home server with OSS attestation software, or my USB Armory with OSS firmware for local "remote" attestation. GrapheneOS can attest to a 2nd mobile device running GrapheneOS, or a web verifier. This is not rocket science. Provide a mobile setting for attestation server URL.
No, dude. Look at Google SafetyNet / Play Integrity. It's used by banking apps, streaming apps, certain games, and much, much more, to lock out devices that don't pass. I believe one of the last Android devices that will ever be able to pass SafetyNet while rooted is the OnePlus 7 Pro. Not that I'm ever going to tweak on Android again until TWRP adds a setting to disable OpenRecoveryScript, since a complete lack of prompting for consent is how I had my last major data loss.
(Apparently it would kill them to add anything like a "script execution in 5 seconds, cancel?" popup.)
matheusmoreira
This has nothing to do with attestation servers. It's about who the corporations trust. Namely, each other.
Your attestation server doesn't matter. The corporations are not gonna trust any attestation provided by your home server running open source software under your control. They're not gonna trust GrapheneOS's AOSP attestation where you provide your own keys. Simply because your open source software has the power to straight up wipe out their entire business models if left unchecked. They'll deny you service if you use it.
Think about it. You can reverse engineer their apps and network protocols and build better software that doesn't advertise to users, that doesn't collect their information, that automates boring tasks, that copies data they don't want copied, that transmits data they want censored. This stuff directly impacts their bottom line and they absolutely want cryptographic proof that you are not doing anything of the sort.
They're not gonna trust your keys. They're gonna trust Google's and Apple's. Because their interests are aligned with Google's and Apple's, and not with yours.
They've set things up so that they own the computers. They're just generously letting us use them, so long as we follow their rules and policies. If we hack the computer to take control of what should be ours to begin with, they call it "tampering". And now they have hardware cryptographic evidence of this "tampering". This allows them to discriminate against us, exclude us. Since it's hardware cryptography, it's exceedingly difficult to fake or bypass.
This is the future. Either you use a corporate pwned computer, or you're ostracized from digital society. Can't log into bank accounts. Can't exchange messages over popular services. Can't even play stupid video games. Can't do much of anything unless somehow hackers create a parallel society where none of this attestation business exists.
What good is free software if you can't use it? It's worthless.
fc417fc802
IIRC wasn't it Librem that wanted to have the device attest itself to the user (ie a second device)?
Agreed though. Any major vendor deploying this globally and making it available to developers without restriction would be an affront to our freedom.
matheusmoreira
> Any major vendor deploying this globally and making it available to developers without restriction would be an affront to our freedom.
Google and Apple have already done it. And one day it'll show up in desktops too.
pogue
Would DNS logs suffice? You could use a service that logs DNS queries, like NextDNS, or a Pi-hole, to watch DNS traffic from the device - but you wouldn't know which app sent a query, or for what purpose.
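A minimal sketch of that approach - assuming dnsmasq-format query logs like Pi-hole's, with an illustrative (not authoritative) list of expected domain suffixes:

```python
import re
from collections import Counter

# Assumed log line format (dnsmasq, as written to Pi-hole's pihole.log):
#   "Mar 17 12:00:01 dnsmasq[1234]: query[A] example.com from 192.168.1.10"
QUERY_RE = re.compile(r"query\[(?:A|AAAA|HTTPS)\]\s+(\S+)\s+from\s+(\S+)")

# Illustrative allowlist; a real one would come from vendor documentation.
EXPECTED_SUFFIXES = (".apple.com", ".icloud.com")

def unexpected_queries(lines, device_ip):
    """Count queried domains from one device that match no expected suffix."""
    hits = Counter()
    for line in lines:
        m = QUERY_RE.search(line)
        if not m or m.group(2) != device_ip:
            continue
        domain = m.group(1).lower().rstrip(".")
        if not any(domain == s.lstrip(".") or domain.endswith(s)
                   for s in EXPECTED_SUFFIXES):
            hits[domain] += 1
    return hits
```

As the comment notes, this only surfaces *that* an odd domain was queried, never *which* app queried it - attribution needs an on-device proxy or VPN profile.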
transpute
It might be useful in some cases. Charles Proxy on iOS uses a local VPN profile to associate network traffic with applications.
that_lurker
Unless I remember incorrectly, doesn't iOS do an integrity verification at system boot?
transpute
Has anyone seen an iOS device fail to boot due to an integrity violation?
Whatever it's verifying is insufficient to stop persistent iOS malware, hence the existence of the MVT toolkit, which itself can only identify a small subset of real-world attacks. For evidence, look no further than the endless stream of zero-day CVEs in Apple Security Updates for iOS. Recovery from iOS malware often requires DFU (Device Firmware Update) mode reinstallation from a separate device running macOS.
Non-persistent iOS malware can be flushed by a device hot-key reboot which prevents malware from simulating the appearance of a reboot.
FirmwareBurner
>Non-persistent iOS malware can be flushed by a device hot-key reboot which prevents malware from simulating the appearance of a reboot.
The question is how often do users usually reboot their phone these days?
bri3d
> Whatever it's verifying is insufficient to stop persistent iOS malware, hence the existence of the MVT toolkit
One of these assertions absolutely does not support the other; the newest persistent malware detected on iOS by MVT is from 2023 and targeted iOS 14. In iOS 15, Apple introduced System volumes and SSV. The OS lives on a separate APFS volume snapshot which is verified using a hash tree (think dm-verity, although the implementation sits at a slightly different level). Even Operation Triangulation couldn't achieve reboot persistence for their implant (which Kaspersky calls TriangleDB); rebooting would require re-exploitation.
This also affects your argument about "forensic" imaging (also: if you're asking the device for the image, it's always a logical extraction, and if you don't trust the device, why trust the backup data you asked it for?). Post-iOS-15, unless boot security was compromised - in which case you have bigger problems - you'll get the same bytes back for system files anyway.
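The hash-tree idea behind SSV (and dm-verity) can be illustrated in a few lines - this is a didactic sketch, not Apple's actual on-disk format: hash every block, hash pairs upward to a single root, and seal that root at boot. Modifying any block changes the root.

```python
import hashlib

def merkle_root(blocks):
    """Fold a list of data blocks into a single Merkle root hash."""
    level = [hashlib.sha256(b).digest() for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last hash on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

volume = [b"block-%d" % i for i in range(8)]
sealed = merkle_root(volume)               # the one value verified at boot
tampered = list(volume)
tampered[3] = b"implant"                   # any persistence attempt on the
assert merkle_root(tampered) != sealed     # system volume changes the root
```

This is why post-iOS-15 persistence requires defeating boot security rather than just writing a file: the system volume is read back against the sealed root, so a tampered block simply fails verification.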
saagarjha
Persistent iOS malware is quite rare these days.
Joel_Mckay
Most modern malware is not disk resident, as it has a higher probability of persisting by re-infection with an undocumented zero-day.
For example, people who play games that bind to GPS location services will find the interruptions magically stop for a while after a cold power-off and power-on restart. Or the battery suddenly stops rapidly losing charge in standby, because recording/image capture had been burning power and data budgets.
Ultimately, a smartphone is impossible to fully secure, as the complexity has a million holes in it regardless of the brand. And Gemini is a whole can of worms I'd rather not discuss without my lawyer present. =3
bri3d
Starting in iPadOS and iOS 15, iOS and macOS use a similar Signed System Volume concept and the System volume's integrity is verified.
mindslight
I recently had the "pleasure" of reading over a criminal forensic investigation report. It was harrowing. The report was basically like "we ran virus check and it reported clean so nobody could have accessed the system remotely" and then it moved right along to the next thing. The logic felt more dubious than some of the court scenes from Idiocracy. And it had been produced for defense counsel and paid for by the defendant.
vaylian
Did the defendant argue that the system was compromised and that they therefore did not commit the crime?
mindslight
I have no idea what arguments were actually made. But that concern was raised somewhere along the chain asking for my (informal technical) opinion.
It's obviously quite difficult to prove a negative in general, but the complete lack of any standard of care then presented as an "expert opinion" for the defense was astounding.
(FWIW this was a MS Windows machine, and I think the AV was just Windows Defender)
dr_zoidberg
The lack of standards falls on the practitioner. I ran a quick search and found that SWGDE best-practice guides and documents do consider the presence of malware on digital evidence sources in many different scenarios [1]. Having an "expert" who is unaware of these guides is another story.
[1] https://www.swgde.org/?swp_form%5Bform_id%5D=1&swps=malware
fdb345
The courts and police don't care about nonsensical phone forensics. The entire EncroChat thing was built on lies, and the courts lapped it up.
pogue
I'd be curious whether anyone has tried this for Android and what kind of stuff it checks for. Sideloaded APKs can often contain malicious stuff, but it's nearly impossible to know whether one is doing anything suspicious unless you open it up with a tool like Apktool [1], or run it on Triage [2] (which supports Android) and watch what it's doing. Most antivirus for Android is pretty much a joke, as far as I'm concerned.
[1] https://github.com/iBotPeaches/Apktool?tab=readme-ov-file
[2] https://tria.ge/
6stringmerc
Does the iPhone / iOS track the profiles of the machines it is physically connected with and when “Allow Access” is selected? I ask because I did not have face authentication or a password on my phone and my ex-landlords illegally obtained my exempt property and I would like to know if they plugged it in to their computer and potentially obtained personal files from it. Yes I know the lack of security was an oversight and failure on my part. I accept that. However, they also tried to steal my car and sell it and refuse to return my property they are not legally entitled to possess (“tools of trade” under Texas law). The legal process takes time so I’m just curious if such a forensics investigation is possible.
sprdnv
I think that if your iOS version is up to date and even a basic unlock code is set, the phone won't make its storage available even when connected: without unlocking it you can't approve the access prompt, let alone set the options to back up your data.
sprdnv
Hard to tell with Apple stuff. How malware gets onto a device to run persistently, rather than only until reboot, varies quite a lot. There was Pegasus, after all.
sprdnv
Try to ask support in a technical manner. Provide ID if asked
sprdnv
My hope is: there's no way on the latest public update.
sprdnv
I mean, did you have some sort of code (I can't remember the name) set? And what did you see when the screen was off and you hard-reset it, or at least soft-reset it, or just locked it with the power button and woke it up the same way?
truekonrads
iVerify uses diagnostic logs for hunting. Give it a go
iOS, https://docs.mvt.re/en/latest/ios/methodology/
> You will need to decide whether to attempt to jailbreak the device and obtain a full filesystem dump, or not.
Since Apple won't allow iDevice owners to access an unredacted raw disk image for forensics, iOS malware detection tools are hamstrung. The inability to fully back up devices means that post-intrusion device restore is literally impossible. Only a new OS version can be installed, then a subset of the original data can be restored, then every app/service needs to re-establish trust with this newly "untrusted" (but more trustworthy than the previously trusted-but-compromised) device.
In theory, Apple could provide their own malware analysis toolset, or provide optional remote attestation to verify OS and baseband integrity.
In the absence of persistent disk artifacts, the next best option is behavioral analysis, e.g. usage anomalies ("dog that did not bark") in CPU, battery, storage or network. Outbound network traffic can be inspected by a router and compared against expected application and system traffic. This requires an outbound firewall where rules can specify traffic by wildcard domain names, which are widely used by CDNs. Apple helpfully provides a list of domains and port numbers for all Apple services.
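The firewall requirement above comes down to wildcard-domain matching. A minimal sketch of the rule-matching step - the patterns mimic the style of Apple's published service-domain list, but the entries here are illustrative, not copied from it:

```python
from fnmatch import fnmatch

# Illustrative rules in the style of Apple's published service domains.
ALLOW = ["*.apple.com", "*.push.apple.com", "*.icloud.com", "gateway.icloud.com"]

def allowed(domain, rules=ALLOW):
    """True if the (normalized) domain matches any wildcard rule."""
    domain = domain.lower().rstrip(".")
    return any(fnmatch(domain, rule) for rule in rules)
```

Note the fnmatch semantics: `*.apple.com` matches any subdomain (including nested ones like `a.b.apple.com`) but not the bare `apple.com`, which would need its own rule - the same gotcha firewall rule languages have.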