iLEAPP Parsers & Photos.sqlite Queries

After recently updating the Photos.sqlite queries, I thought it might be time to build these queries into the great open-source iLEAPP project. I also thought this would be a good way to document how I have converted the queries into iLEAPP parsers, as well as the types of artifacts being parsed with iLEAPP. The parsers are based primarily on iOS versions 15–18, but some of them have been constructed to work with older iOS versions.

iOS Versions used to create parsers:

  18.0 Beta 1, 17.4.1
  17.4, 17.3.1, 17.2.1
  17.0, 16.7.4, 16.1.2
  16.1, 15.8.2, 15.6
  15.4.1, 14.7, 14.3
  13.4.1, 12.4.8, 11.2.1

iOS 18 Beta 1 parsers and queries are preliminary and based on early testing. The artifacts and parsers have been submitted to iLEAPP; once approved, iLEAPP should be able to parse the Photos.sqlite data if you have an iOS 18 iTunes backup or a Cellebrite UFED Advanced Logical acquisition that contains a */PhotoData/Photos.sqlite.

File paths of the Photos.sqlite used to create queries:

Photos.sqlite databases used can be found at the following file paths:

iOS Photos Application (com.apple.MobileSlideShow) database:

  • /private/var/mobile/Media/PhotoData/Photos.sqlite

iOS Syndication Photos Library (also known as Shared with You) database:

  • /private/var/mobile/Library/Photos/Libraries/Syndication.photoslibrary/database/Photos.sqlite

Photos.sqlite queries and iLEAPP parsers:

Even though most of the updated queries and parsers will work on both Photos.sqlite databases (PhotoData/Photos.sqlite and the Syndication Photos Library Photos.sqlite), I have included a list of parser names and a brief description of the artifacts/data parsed from the database. The iLEAPP parsers have been assigned sequence numbers (Ph*) that can be referenced when discussing a particular parser.

Based upon requests, several of the Photos.sqlite parsers have been set to be excluded by default, so most of them will require an iLEAPP user to manually select them. Parsers that are excluded by default are noted with “Excluded By Default” next to the parser name.

Ph1-10 Photos.sqlite iLEAPP Parsers:

Parser names starting with Ph1-10 are some of the core artifact parsers. Ph1-2 will provide basic asset data, while Ph3-10 will parse data about specific artifacts such as assets that have been recently deleted, hidden, and/or last viewed.

Ph1BasicAssetData.py:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets listed within the Photos.sqlite databases. This parser should be used as a preliminary or starting point for your analysis. It will provide one record per zAsset z_PK and can be used when conducting manual verification against the actual database (a minimal query sketch follows the list below). This parser will include the following data and more:

  • Identifiers, such as: zAsset z_PK, zAdditionalAssetAttributes z_PK, Master Fingerprint, UUID
  • Timestamps: Camera Roll Sort, Created, Added, Modification, EXIF String, Shared, Trash, and Last Viewed
  • File path(s)
  • Filenames and Original filenames
  • Imported by (bundle) information
  • Visibility State – Is the asset visible within the camera roll
  • Saved Asset Type – Type of asset, provides shared indications
  • Bundle Scope – Provides iCloud shared indications
  • Syndication State – Asset syndication state from Syndication Photos Library
  • Active Library Scope Participation State – indicates if the asset is in an iCloud Shared Photo Library
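
As a rough illustration of how this data can be pulled for manual verification, the following is a minimal Python/sqlite3 sketch of the core of the Ph1 query. It is not the actual iLEAPP query: the column names assume an iOS 15/16-era schema, the database path is hypothetical, and the real parser handles per-version schema differences and many more columns.

    import sqlite3

    # Work on a copy of the database exported from your acquisition (hypothetical path).
    DB = "PhotoData/Photos.sqlite"

    # Core Data timestamps are seconds since 2001-01-01; adding 978307200 shifts them
    # to the Unix epoch so SQLite's datetime() can render them.
    QUERY = """
    SELECT
        zAsset.Z_PK AS "zAsset z_PK",
        zAddAttr.Z_PK AS "zAddAssetAttr z_PK",
        zAsset.ZUUID AS "UUID",
        zAsset.ZDIRECTORY AS "Directory",
        zAsset.ZFILENAME AS "Filename",
        zAddAttr.ZORIGINALFILENAME AS "Original Filename",
        datetime(zAsset.ZDATECREATED + 978307200, 'unixepoch') AS "Created (UTC)",
        datetime(zAsset.ZADDEDDATE + 978307200, 'unixepoch') AS "Added (UTC)",
        zAsset.ZVISIBILITYSTATE AS "Visibility State",
        zAsset.ZSAVEDASSETTYPE AS "Saved Asset Type"
    FROM ZASSET zAsset
    LEFT JOIN ZADDITIONALASSETATTRIBUTES zAddAttr ON zAddAttr.ZASSET = zAsset.Z_PK
    ORDER BY zAsset.Z_PK
    """

    with sqlite3.connect(DB) as conn:
        conn.row_factory = sqlite3.Row
        for row in conn.execute(QUERY):
            print(dict(row))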

Ph2BasicAssetandAlbumData.py:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide basic asset record data for assets listed within the Photos.sqlite databases. This parser should be used as the second parser/analysis point. It will provide at least one record per zAsset z_PK and additional records for each asset that has been associated with an album. This will include the data from Ph1BasicAssetData.py plus some additional data:

  • Extended Attributes camera data
  • Cloud Master data
  • Location Data
  • Cloud Master Metadata indicators (see Ph10AssetParsedEmbeddedFiles.py for more information)
  • Generic Album data for the asset: Album Kind, Title, asset Counts, and others

Ph3TrashedRemovedfromCamRoll.py:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS11-18):

Apple.com Recently Deleted Photos

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets that have been marked as recently deleted, indications of who may have deleted shared photo library assets, and Syndication Photos Library assets that have been removed from the camera roll view (a minimal query sketch follows the list below).

  • PhotoData/Photos.sqlite parser: This will include data for assets that have been marked as recently deleted, including the following:
    • Indicators of the Shared Photo Library participant who trashed an asset
    • Share Participant key identifier; if available, an Email and/or Phone Number might be listed
  • Syndication.photoslibrary/database/Photos.sqlite parser: This will include data for Shared with You Syndication Photos Library assets that were displayed in the camera roll and then, due to user interaction, removed from camera roll viewing. This is based on the asset's zAsset Syndication State value.
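
The sketch below (reusing the sqlite3 pattern shown for Ph1) approximates the two filters described above. Column names such as ZTRASHEDSTATE, ZTRASHEDDATE, ZTRASHEDBYPARTICIPANT, and ZSYNDICATIONSTATE assume iOS 15/16-era schemas and are not present in every older version; the decoded meaning of individual syndication state values is based on ongoing research, so the second query simply surfaces non-zero states rather than asserting specific values.

    # Recently deleted assets (PhotoData/Photos.sqlite); zTrashedByParticipant is an
    # integer that can be joined with zShareParticipant z_PK (iOS 16+ schemas).
    TRASHED_QUERY = """
    SELECT
        zAsset.Z_PK,
        zAsset.ZFILENAME,
        datetime(zAsset.ZTRASHEDDATE + 978307200, 'unixepoch') AS "Trash Date (UTC)",
        zAsset.ZTRASHEDBYPARTICIPANT AS "Trashed by Participant (zSharePartic z_PK)"
    FROM ZASSET zAsset
    WHERE zAsset.ZTRASHEDSTATE = 1
    """

    # Syndication (Shared with You) assets with a non-default syndication state;
    # run against Syndication.photoslibrary/database/Photos.sqlite and interpret the
    # state values against the decoding documented in the Ph3 parser.
    SYNDICATION_QUERY = """
    SELECT
        zAsset.Z_PK,
        zAsset.ZFILENAME,
        zAsset.ZSYNDICATIONSTATE
    FROM ZASSET zAsset
    WHERE zAsset.ZSYNDICATIONSTATE IS NOT NULL
      AND zAsset.ZSYNDICATIONSTATE != 0
    """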

Ph4Hidden.py:

PhotoData/Photos.sqlite (iOS11-18):

Apple.com Photos Hidden

This iLEAPP parser and embedded SQLite query will provide basic asset record data for assets that have been marked as hidden and only applies to the PhotoData/Photos.sqlite database.

Ph5HasLocations.py:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS11-18):

Apple.com Photos Locations

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets that have location data. This parser is based upon an asset record having valid data within the zAsset zLatitude and zExtendedAttributes zLatitude fields. The two Photos.sqlite databases will have different assets with different location data. If the goal is to analyze assets with location data, please review the results for both databases.

You will also want to review the columns that contain decoded data from embedded property lists from zAdditionalAssetAttributes – zShiftedLocationData, zAdditionalAssetAttributes – zReverseLocationData, and zCloudMasterMetadata zData fields. These embedded property lists are also decoded within the Ph10AssetParsedEmbeddedFiles.py parser.  
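
A minimal sketch of the location filter behind this parser (run with the same sqlite3 pattern shown for Ph1), assuming iOS 14+ column names and the commonly observed -180.0 placeholder for "no location" in zAsset; verify that placeholder against your own data. The actual parser also returns the embedded plist columns discussed above.

    LOCATION_QUERY = """
    SELECT
        zAsset.Z_PK,
        zAsset.ZFILENAME,
        zAsset.ZLATITUDE    AS "zAsset Latitude",
        zAsset.ZLONGITUDE   AS "zAsset Longitude",
        zExtAttr.ZLATITUDE  AS "zExtendedAttributes Latitude",
        zExtAttr.ZLONGITUDE AS "zExtendedAttributes Longitude"
    FROM ZASSET zAsset
    LEFT JOIN ZEXTENDEDATTRIBUTES zExtAttr ON zExtAttr.ZASSET = zAsset.Z_PK
    WHERE zAsset.ZLATITUDE != -180.0
       OR zExtAttr.ZLATITUDE != -180.0
    """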

Ph6ViewedPlayData.py:

PhotoData/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets with data indicating the asset was viewed, played, and/or if supported by the iOS version, assets that have a zAdditionalAssetAttributes zLastViewedDate timestamp (> iOS16.6).

Ph7Favorite.py:

PhotoData/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets with data indicating the asset was favorited via a zAsset zFavorite field value.

Ph8HasAdjustment.py:

PhotoData/Photos.sqlite (iOS11-18):

Apple.com Photos Adjustments

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets with data indicating the asset was adjusted or has a mutation. These records are based on assets that have a zAsset zHasAdjustments value indicating the asset was adjusted (a minimal query sketch follows the list below). This parser will include the following data and more:

  • Basic asset data: file path, filename, original filename, primary keys, fingerprint, uuid and others
  • zAdditionalAssetAttributes zEditedBundleID
  • zUnmanagedAdjustment table fields
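
A minimal sketch of the adjustment filter and the zUnmanagedAdjustment join (run with the same sqlite3 pattern shown for Ph1). The join key and the two adjustment columns shown here are assumptions based on iOS 15/16-era schemas; the actual parser returns many more fields.

    ADJUSTED_QUERY = """
    SELECT
        zAsset.Z_PK,
        zAsset.ZFILENAME,
        zAsset.ZHASADJUSTMENTS,
        zUnmAdj.ZADJUSTMENTFORMATIDENTIFIER,
        datetime(zUnmAdj.ZADJUSTMENTTIMESTAMP + 978307200, 'unixepoch') AS "Adjustment Time (UTC)"
    FROM ZASSET zAsset
    LEFT JOIN ZUNMANAGEDADJUSTMENT zUnmAdj ON zAsset.ZUNMANAGEDADJUSTMENT = zUnmAdj.Z_PK
    WHERE zAsset.ZHASADJUSTMENTS = 1
    """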

Ph9BurstAvalanche.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS11-18):

Apple.com Burst Mode Photos

This iLEAPP parser and embedded SQLite query will provide basic asset records for assets with data indicating the asset was part of an avalanche / burst capture. These records are based on assets that have a value in the zAsset zAvalanchePickType or zAdditionalAssetAttributes zCloudAvalanchePickType field indicating the asset was part of an avalanche / burst media capture using the native Apple Camera Application. Some of these media files are not natively displayed in the camera roll. The results of this parser allow analysis of assets both displayed and not displayed in the camera roll. This parser will include the following data and more:

  • Basic asset data: file path, filename, original filename, primary keys, fingerprint, uuid and others
  • zAsset zAvalanchePickType
  • zAdditionalAssetAttributes zCloudAvalanchePickType
  • zUnmanagedAdjustment table fields

Ph10AssetParsedEmbeddedFiles.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide asset records for assets that have embedded files in the fields listed below. This separate parser was created because of the errors that can occur depending on how the data is encoded within the field(s) across different iOS versions. This parser has been excluded from the default iLEAPP parsers. Please run this parser during a separate instance of iLEAPP due to the length of time it might take to process and the number of files that could be exported from the database. As noted above, this parser will work for both Photos.sqlite databases. The embedded files are exported to the iLEAPP report folder. The list of fields from which embedded data is parsed will continue to grow as research and testing continue (a decoding sketch follows the list below):

  • Basic asset data: file path, filename, original filename, primary keys, fingerprint, uuid and others
  • zAdditionalAssetAttributes zShiftedLocationData – plist containing location data
  • zAdditionalAssetAttributes zReverseLocationData – plist containing location data
  • zCloudMasterMetadata zData – plist containing exif and other metadata about the asset
  • zShare zPreviewData – blob containing an image file used as a preview of the shared asset
  • zShareParticipant zNameComponents – plist containing name information for person who shared an asset via iCloud Shared Link or iCloud Shared Photo Library
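
The sketch below shows one hedged way to export and decode these blobs with Python's plistlib, using zReverseLocationData as the example. The paths are hypothetical, and many of these blobs are NSKeyedArchiver-style plists, so a simple plistlib load yields the archived structure rather than a tidy dictionary; the raw blob is exported either way for manual review.

    import plistlib
    import sqlite3
    from pathlib import Path

    DB = "PhotoData/Photos.sqlite"      # hypothetical path
    OUT = Path("exported_embedded")     # hypothetical export folder
    OUT.mkdir(exist_ok=True)

    QUERY = """
    SELECT zAsset.Z_PK, zAddAttr.ZREVERSELOCATIONDATA
    FROM ZASSET zAsset
    JOIN ZADDITIONALASSETATTRIBUTES zAddAttr ON zAddAttr.ZASSET = zAsset.Z_PK
    WHERE zAddAttr.ZREVERSELOCATIONDATA IS NOT NULL
    """

    with sqlite3.connect(DB) as conn:
        for pk, blob in conn.execute(QUERY):
            # Export the raw blob, then attempt a straight binary-plist decode.
            (OUT / f"asset_{pk}_reverselocationdata.bplist").write_bytes(blob)
            try:
                decoded = plistlib.loads(blob)
                print(pk, type(decoded))
            except plistlib.InvalidFileException:
                print(pk, "not a plain plist; review the exported blob manually")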

Ph15PeopleandDetFacesNAD.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS14-18):

Apple.com Photos People and Faces

This iLEAPP parser and embedded SQLite query will provide records from the zDetectedFace, zPerson, zDetectedFacePrint, zFaceCrop, and zDetectedFaceGroup Photos.sqlite tables. This parser will not include asset record data such as file names and dates (NAD). This parser and embedded SQLite query will include results not found in Ph16AssetPeopleandDetFaces.py because it includes face and person data not associated with a specific asset (instances where zDetectedFace zAsset or zAssetForFace has no value).

Ph16AssetPeopleandDetFaces.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS14-18):

Apple.com Photos People and Faces

This iLEAPP parser and embedded SQLite query will provide asset records for assets that have been analyzed by the OS and have been identified as media containing faces, people and pets. This parser and SQLite query will parse asset records that have a related zDetectedFace z_PK value. This parser will provide basic asset data, face data, people and pets data.

Ph15PeopleandDetFacesNAD.py and Ph16AssetPeopleandDetFaces.py will contain data from the people and face tables that still requires additional research and decoding and should be verified. Unfortunately, I don't have a lot of people volunteering to have their photos captured and used as test data. In my opinion, this would be a great set of parsers to improve upon during a digital forensics course or school project.

These parsers include decoded property lists and embedded jpg files. Because of these property lists and embedded files, these parsers have been excluded from running by default. After running these parsers, remember to check the report folder for exported files.

Ph20-29 Photos.sqlite iLEAPP Parsers:

Parser names starting with Ph20-29 are parsers focused on Albums (Non-Shared Albums and Shared Albums) and assets that have been associated with an album (or albums). Because of the way Apple iOS Photos.sqlite data is structured, these parsers will also include Syndication Photos Library (Shared with You) assets from both Photos.sqlite databases. If “NAD” is included in the parser name, the parser will not include asset data (No Asset Data – NAD). NAD parsers will include specific data for the album and/or the Shared with You conversation(s).

Ph20AlbumsNAD.py:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide album record data from the zGenericAlbum table within the Photos.sqlite databases. This parser should be used as the first parser/analysis point when investigating the albums found within the database. This parser will also detail albums that exist but do not contain assets (a minimal query sketch follows the list below). The following are some of the fields listed within the parser output:

  • zGenericAlbum Creation Date – when the album was created (iOS14-18)
  • zGenericAlbum Start Date – date for the first asset (chronology) in the album
  • zGenericAlbum End Date – date for the last asset (chronology) in the album
  • zGenericAlbum Kind – type of album. There are several values that I have not been able to decode, but most of the important album kinds have been decoded.
  • zGenericAlbum Title – user and system titled albums
  • Imported by bundle ID information
  • Number of photos and videos in the album
  • Album trash status
  • UUID and GUID
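
A minimal sketch of the zGenericAlbum query (run with the same sqlite3 pattern shown for Ph1), decoding only the zKind values discussed in this post (2, 1505, 1509) and leaving everything else as an undecoded raw value; column names assume iOS 14+ schemas.

    ALBUMS_QUERY = """
    SELECT
        zGenAlbum.Z_PK,
        zGenAlbum.ZTITLE,
        datetime(zGenAlbum.ZCREATIONDATE + 978307200, 'unixepoch') AS "Creation Date (UTC)",
        datetime(zGenAlbum.ZSTARTDATE + 978307200, 'unixepoch')    AS "Start Date (UTC)",
        datetime(zGenAlbum.ZENDDATE + 978307200, 'unixepoch')      AS "End Date (UTC)",
        CASE zGenAlbum.ZKIND
            WHEN 2    THEN '2-Non-Shared Album'
            WHEN 1505 THEN '1505-Shared Album'
            WHEN 1509 THEN '1509-Shared with You Conversation'
            ELSE 'Undecoded-' || zGenAlbum.ZKIND
        END AS "Album Kind",
        zGenAlbum.ZTRASHEDSTATE AS "Trashed State",
        zGenAlbum.ZUUID
    FROM ZGENERICALBUM zGenAlbum
    ORDER BY zGenAlbum.Z_PK
    """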

Ph21AlbumsNonSharedNAD.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide Non-Shared Album record data from the zGenericAlbum table where the zGenericAlbum zKind value is 2. This parser will not contain asset data, only album data. This parser should be used as the second parser/analysis point when investigating the Non-Shared Albums found within the database. This parser will also detail albums that exist but do not contain assets. The following are some of the fields listed within the parser output:

  • Parent Root/Album/Folder data
  • zGenericAlbum Dates
  • zGenericAlbum Title and unique identifiers (UUID and GUID)
  • zGenericAlbum Kind – type of album. There are several values that I have not been able to decode, but most of the important album kinds have been decoded.
  • Album trash status and other data

Ph22AssetsInNonSharedAlbums.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS11-18):

Apple.com Photo Albums

This iLEAPP parser and embedded SQLite query will provide asset records for assets in Non-Shared Albums. This will include asset records and album records for assets that have an associated zGenericAlbum zKind (2) value indicating the asset is in a Non-Shared Album (a note on the album/asset join follows the list below). The following are some of the fields listed within the parser output:

  • Basic Asset data: asset dates, file names, SPL status, unique identifiers, and others 
  • Album data: including parent, dates, title, trash status, unique identifiers, and others
  • zGenericAlbum Title and unique identifiers (UUID and GUID)
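
One practical wrinkle with the album parsers: assets link to zGenericAlbum through a Core Data join table whose name (Z_26ASSETS, Z_28ASSETS, and similar) changes between iOS versions, which is part of why the embedded queries are version specific. A hedged way to confirm the join table name on an unfamiliar schema before building the join manually:

    import sqlite3

    DB = "PhotoData/Photos.sqlite"  # hypothetical path

    with sqlite3.connect(DB) as conn:
        candidates = conn.execute(
            "SELECT name FROM sqlite_master "
            "WHERE type = 'table' AND name LIKE 'Z!_%ASSETS' ESCAPE '!'"
        ).fetchall()
        # Expect something like [('Z_28ASSETS',)] depending on the iOS version.
        print([name for (name,) in candidates])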

Ph23AlbumsSharedNAD.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS11-18):

This iLEAPP parser and embedded SQLite query will provide Shared Album record data from the zGenericAlbum table where the zGenericAlbum zKind value is 1505. This parser will not contain asset data, only album data. This parser should be used as the second parser/analysis point when investigating the Shared Albums found within the database. This parser will also detail albums that exist but do not contain assets. In addition to the album data listed above, this parser will also include the following fields to provide insight into the invites for the shared album(s) and their assets:

  • zCloudSharedAlbumInviteRecord zIsMine – Indication if the shared album invite belongs to the examined device Apple ID
  • zCloudSharedAlbumInviteRecord zInvitationState – will indicate whether the invite is pending, accepted, declined, or in another state not yet decoded.
  • zCloudSharedAlbumInviteRecord zInviteeSubscriptionDate – timestamp when the invitee handled the invite.
  • zCloudSharedAlbumInviteRecord Invitee data – Invitee name values and email

Ph24AssetsInSharedAlbums.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS11-18):

Apple.com Photo Shared Albums

This iLEAPP parser and embedded SQLite query will provide asset records for assets in Shared Albums. This will include asset records and album records for assets that have a zGenericAlbum zKind (1505) value indicating the asset is in a Shared Album. The following are some of the fields listed within the parser output:

  • Basic Asset data: asset dates, file names, SPL status, unique identifiers, and others 
  • Album data: including parent, dates, title, trash status, unique identifiers, and others
  • zCloudSharedAlbumInviteRecord zIsMine – Indication if the shared album invite belongs to the examined device Apple ID
  • zCloudSharedAlbumInviteRecord zInvitationState – will indicate whether the invite is pending, accepted, declined, or in another state not yet decoded.
  • zCloudSharedAlbumInviteRecord zInviteeSubscriptionDate – timestamp when the invitee handled the invite.
  • zCloudSharedAlbumInviteRecord Invitee data – Invitee name values and email

Ph25SWYConvAlbumsNAD.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide Syndication Photos Library Shared with You Conversation record data from the zGenericAlbum table where the zGenericAlbum zKind value is 1509. This parser will not contain asset data, only Syndication Photos Library Shared with You Conversation record data. When using this parser with the Syndication.photoslibrary database, you will get more records than you will from PhotoData/Photos.sqlite. This parser should be used as the first parser/analysis point when investigating the Syndication Photos Library Shared with You Conversation data. It will include the following fields to provide insight into Shared with You conversation threads and the identifiers (emails and phone numbers) used within those conversations (a minimal query sketch follows the list below). Most of this parsed data is very similar to album data, with the exception of the following additional fields:

  • zAsset zConversation – This is the foreign key used to join an asset record to the zGenericAlbum record.
  • zGenericAlbum zImportSessionID – This field will include an Apple ID email or phone number that could be used to identify a participant in the conversation.
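
A minimal sketch of the conversation-record query (run with the same sqlite3 pattern shown for Ph1), assuming iOS 15+ column names and pulling only the fields called out above.

    SWY_CONVERSATIONS_QUERY = """
    SELECT
        zGenAlbum.Z_PK,
        zGenAlbum.ZIMPORTSESSIONID AS "SWY Conversation ID (Apple ID email / phone)",
        datetime(zGenAlbum.ZCREATIONDATE + 978307200, 'unixepoch') AS "Creation Date (UTC)",
        zGenAlbum.ZTRASHEDSTATE AS "Trashed State"
    FROM ZGENERICALBUM zGenAlbum
    WHERE zGenAlbum.ZKIND = 1509
    ORDER BY zGenAlbum.ZCREATIONDATE
    """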

Ph26SyndicationPLAssets.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS15-18):

Apple.com Syndication Photos Library – Shared with You

This iLEAPP parser and embedded SQLite query will provide asset records for assets that are a part of a Syndication Photos Library Shared with You conversation. This will include asset records and conversation album records for assets that have a zGenericAlbum zKind (1509) value indicating the asset is in a Shared with You Conversation. The following are some of the fields listed within the parser output:

  • Basic Asset data: asset dates, file names, SPL status, unique identifiers, and others 
  • Shared with You Conversation Album data: including dates, trash status, unique identifiers, and others
  • zAsset zConversation – This is the foreign key used to join an asset record to the zGenericAlbum record.
  • zGenericAlbum zImportSessionID – This field will include an Apple ID email or phone number that could be used to identify a participant in the conversation.
  • zAsset zSyndicationState – Based on my ongoing testing and research, this can provide insight into particular actions that occurred with an asset.

Ph30-39 Photos.sqlite iLEAPP Parsers:

Parser names starting with Ph30-39 are parsers that will allow for analysis of assets related to iCloud Shared Links and iCloud Shared Photo Library (SPL). These parsers will provide data related to iCloud Shared Links also known as Cloud Master Moments and data indicating who shared the assets. These parsers will also allow for analysis of data related to iCloud Shared Photo Library, including data indicating the owner / sharing participant of an asset, and the status of invites to the iCloud Shared Photo Library.

Ph30iCloudShareMethodsNAD.py:

PhotoData/Photos.sqlite (iOS14-18):

Apple.com Share Photos and Videos

This iLEAPP parser and embedded SQLite query will provide share records from the zShare table within the Photos.sqlite database. This parser will not contain specific asset data (NAD), only general share record data from the zShare and zShareParticipant tables (a minimal query sketch follows the list below). The following are some of the fields listed within the parser output:

  • zShare Creation Date – when the share was created
  • zShare Start Date – date for the first asset (chronology) in the share
  • zShare End Date – date for the last asset (chronology) in the share
  • zShare Expiry Date – date the share link will expire
  • zShare Scope Type – will indicate whether the share is an iCloud Link share or an iCloud Shared Photo Library. There might be other share methods; additional testing is ongoing.
  • zShare counts – different counts exist depending on the type of share.
  • zShare Title – will indicate if the share record is for an iCloud Shared Photo Library invite
  • zShare URL – will contain URL for the shared record
  • zShareParticipant Acceptance Status – will indicate if the invite has been accepted
  • zShareParticipant identifiers – Multiple columns will contain zShare Participant identifiers such as, User ID, Email, Phone Number
  • And other columns which provide insight into the sharing record
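
A minimal sketch of the zShare / zShareParticipant join behind this parser (run with the same sqlite3 pattern shown for Ph1). The column names assume iOS 16-era schemas, and the scope type, acceptance status, and role values should be interpreted against the decoding in the actual parser.

    SHARE_METHODS_QUERY = """
    SELECT
        zShare.Z_PK,
        zShare.ZTITLE,
        zShare.ZSHAREURL,
        zShare.ZSCOPETYPE,
        datetime(zShare.ZCREATIONDATE + 978307200, 'unixepoch') AS "Share Created (UTC)",
        datetime(zShare.ZEXPIRYDATE + 978307200, 'unixepoch')   AS "Share Expiry (UTC)",
        zSharePartic.ZEMAILADDRESS,
        zSharePartic.ZPHONENUMBER,
        zSharePartic.ZACCEPTANCESTATUS,
        zSharePartic.ZISCURRENTUSER,
        zSharePartic.ZROLE
    FROM ZSHARE zShare
    LEFT JOIN ZSHAREPARTICIPANT zSharePartic ON zSharePartic.ZSHARE = zShare.Z_PK
    ORDER BY zShare.Z_PK
    """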

Ph31iCloudSharePhotoLibraryNAD.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS16-18):

This iLEAPP parser and embedded SQLite query will provide share records from the zShare table within the Photos.sqlite database for iCloud Shared Photos Library records and invites only. This parser will not contain asset data (NAD), only general share record data from the zShare and zShareParticipant tables. This parser will contain the same columns as Ph30iCloudShareMethodsNAD.py but will include a few extra data columns related to the iCloud Shared Photos Library (SPL).

Ph32AssetsIniCldSPLwContrib.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS16-18):

Apple.com iCloud Shared Photo Library

This iLEAPP parser and embedded SQLite query will provide basic asset data for assets that are in the iCloud Shared Photo Library. This parser will also contain data to provide insight into identifiers for an asset contributor to the iCloud Shared Photos Library. In addition to some of the basic asset data already mentioned in Ph2BasicAssetandAlbumData.py, this parser will include the following additional data columns:

  • zAsset Active Library Scope Participation State – will indicate if the asset is in an active SPL
  • zAsset Trash Date – will indicate a date when the asset was marked as recently deleted
  • zAsset Trashed by Participant – will indicate which iCloud Shared Photos Library Participant marked the asset as recently deleted. This column will contain an integer which can be joined with zShareParticipant z_PK value. See the other zShareParticipant column data for additional data about the participant.
  • zShare Participant Role – will indicate the participant role. Based on my testing and interpretation, I have identified these roles as an owner role and an invitee role.
  • zAssetContributor zParticipant – This column will contain an integer which can be joined with zShareParticipant z_PK value.
  • zShare Cloud Photo Count – number of photos in the iCloud Shared Photo Library
  • zShare Count of Assets Added by Camera Smart Sharing – number of assets added to the iCloud Shared Photos Library when captured with the Apple Camera Application when the automatic share to iCloud Shared Photo Library setting was ON.
  • zShare Cloud Video Count – number of videos in the iCloud Shared Photo Library
  • And other columns related to the asset and the iCloud Shared Photo Library record and participant.

If you would like to use the Ph32AssetsIniCldSPLwContrib.py parser output to isolate only those assets the examined device Apple ID has contributed to the iCloud Shared Photos Library, use the following field and search term. When using Timeline Explorer, you can use column sorting or filtering to surface these assets (a scripted alternative follows the search term below):

  • SPLzShareParticipant Is Current User – search term “1-Participant-Is_CurrentUser-1”
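
If you would rather script that isolation than sort in Timeline Explorer, the TSV report can be filtered with a few lines of Python. The report filename and column header below are assumptions; use the actual Ph32 TSV produced by your iLEAPP run.

    import csv

    TSV = "Ph32AssetsIniCldSPLwContrib.tsv"  # hypothetical report filename

    with open(TSV, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh, delimiter="\t")
        own_contributions = [
            row for row in reader
            if row.get("SPLzShareParticipant Is Current User") == "1-Participant-Is_CurrentUser-1"
        ]

    print(f"{len(own_contributions)} assets contributed by the examined device Apple ID")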

Ph33AssetsIniCldSPLfromOtherContrib.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS16-18):

This iLEAPP parser and embedded SQLite query will provide basic asset data for assets that are in the iCloud Shared Photos Library. This parser will also contain data that provides insight into identifiers for an asset contributor to the iCloud Shared Photos Library. This parser will isolate those assets that have been shared to the iCloud Shared Photos Library by participants other than the examined device Apple ID. This can be investigated by analyzing the following columns:

  • SPLzShareParticipant Is Current User – should indicate Not Current User
  • SPLzShareParticipant Role – should indicate Invitee Role
  • SPLzShareParticipant Email Address – should not match the examined device Apple ID
  • SPLzShareParticipant Phone Number – should not match the examined device Apple ID

Ph34iCloudSharedLinksNAD.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS14-18):

This iLEAPP parser and embedded SQLite query will provide share records from the zShare table within the Photos.sqlite database for iCloud Shared Links only. This parser will not contain asset data (NAD), only general share record data from the zShare and zShareParticipant tables. This parser will contain the same columns as Ph30iCloudShareMethodsNAD.py but will include a few extra data columns related to iCloud Shared Links. Some columns I would suggest reviewing and analyzing:

  • zShare Asset Count
  • zShare Photos Count
  • zShare Uploaded Photos Count
  • zShare Videos Count
  • zShare Uploaded Videos Count
  • zShare Share URL
  • zShare Public Permission

Ph35iCloudSharedLinkAssets.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS14-18):

Apple.com iCloud Shared Links

This iLEAPP parser and embedded SQLite query will provide basic asset data for assets related to iCloud Shared Links. In addition to basic asset data this parser will include additional data providing insights into the iCloud Shared Link.

Ph50-55 Photos.sqlite iLEAPP Parsers:

Parser names starting with Ph50-55 are parsers that will allow for analysis of assets and data stored within the Photos.sqlite zInternalResource table. These parsers contain some of the largest SQLite queries for the Photos.sqlite databases. The Ph50AssetIntResouData.py parser should be used as the initial/primary parser to start a detailed analysis of an asset. These parsers will not only provide Photos.sqlite data about the main asset, but will also provide data about other related media files, such as the files that make up a Live Photo, metadata files, thumbnails, and others. Please review my blog post about asset optimization and determining whether a full-sized file is present for more information.

When using these parsers and their output, I would strongly encourage you to use the iLEAPP Tab Separated Value (TSV) reports and Eric Zimmerman's Timeline Explorer. When conducting an analysis, I would encourage you to use the search feature with an asset filename or zAsset z_PK to isolate the data for the individual asset you are analyzing. This should provide you with the data to start the analysis and determine whether the asset's full-size file is on the device or the asset has been optimized and synced with iCloud Photos.

Ph50AssetIntResouData.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS14-18):

Apple.com Optimize iPhone Storage

This iLEAPP parser and embedded SQLite query will provide asset record data and internal resource file data. This parser will include record data for full-sized assets available locally on the device and for assets that have been optimized for iPhone storage (a minimal query sketch follows the list below). The following are some of the key fields listed within the parser output for analysis:

  • zInternalResource Local Availability – will indicate if an asset / internal resource is available locally
  • zInternalResource Recipe ID – will indicate the type of asset / internal resource associated with the zInternalResource Fingerprint. Based on my research and testing I have interpreted/decoded these Recipe IDs into a plain language definition.
  • zAdditionalAssetAttributes Master Fingerprint – will indicate the fingerprint for the main / full sized asset.
  • zInternalResource Fingerprint – will indicate the fingerprint for the main asset and other associated internal resource files. One of the internal resource fingerprints will match the additional asset attributes master fingerprints and will indicate the main / primary asset.
  • zInternalResource Type, Datastore Sub-Type, and Cloud Source Type – will indicate the internal resource type that can be used in conjunction with the zAsset Kind Sub-Type and zAdditionalAssetAttributes Cloud Kind Sub-Type to determine the type of file being analyzed.
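
A minimal sketch of the zInternalResource join and the fingerprint comparison described above (run with the same sqlite3 pattern shown for Ph1), assuming iOS 14-17 column names (newer versions rename some of the fingerprint fields). Local Availability and Recipe ID values should be interpreted against the decoding built into the actual parser.

    INTERNAL_RESOURCE_QUERY = """
    SELECT
        zAsset.Z_PK,
        zAsset.ZFILENAME,
        zIntResou.ZLOCALAVAILABILITY AS "Local Availability",
        zIntResou.ZRECIPEID          AS "Recipe ID",
        zAddAttr.ZMASTERFINGERPRINT  AS "Master Fingerprint",
        zIntResou.ZFINGERPRINT       AS "Resource Fingerprint",
        CASE
            WHEN zIntResou.ZFINGERPRINT = zAddAttr.ZMASTERFINGERPRINT
            THEN 'Main / primary asset resource'
            ELSE 'Other internal resource'
        END AS "Resource Role"
    FROM ZASSET zAsset
    JOIN ZADDITIONALASSETATTRIBUTES zAddAttr ON zAddAttr.ZASSET = zAsset.Z_PK
    JOIN ZINTERNALRESOURCE zIntResou         ON zIntResou.ZASSET = zAsset.Z_PK
    ORDER BY zAsset.Z_PK
    """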

Ph51PossOptimizedAssetsIntResouData.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS14-18):

This iLEAPP parser and embedded SQLite query will provide asset record data and internal resource file data. This parser will contain limited record results. It can be used as a quick way to identify asset records for assets that have been Optimized for iPhone Storage and whose full-size file is not available locally on the device. To conduct a detailed analysis, please use the Ph50AssetIntResouData.py parser. This parser is based on the values stored in specific zInternalResource fields; for details, please review the embedded SQLite query.

Ph70UserAdjustDateTimezoneLocation.py – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

Apple Change the Date, Time, or Location

This iLEAPP parser and embedded SQLite query will provide asset record data for assets where there is a conflict between some of the data recorded in the Photos.sqlite database. These conflicts could occur for a number of reasons and a full data analysis should be conducted for each individual asset to verify why the conflict exists. This parser only applies to the */PhotoData/Photos.sqlite database.

One of the reasons the conflict could exist is a user making adjustments to the assets via the Apple Photos Application (com.apple.mobileslideshow). Ian Whiffin published a blog post discussing this Apple Photos feature. Within his blog he published a table of fields within Photos.sqlite that can be analyzed to determine if there might have been some user interaction with the data:

ZASSET TABLE
  • ZDATECREATED – Affected by the change
  • ZLATITUDE – Affected by the change
  • ZLONGITUDE – Affected by the change
  • ZADDEDDATE – Unaffected
  • ZANALYSISSTATEMODIFICATIONDATE – Unaffected

ZADDITIONALASSETATTRIBUTES TABLE
  • ZSCENEANALYSISTIMESTAMP – Affected by the change
  • ZTIMEZONEOFFSET – Affected by the change
  • ZTIMEZONENAME – Affected by the change
  • ZREVERSELOCATIONDATA – Affected by the change
  • ZGPSHORIZONTALACCURACY – Affected by the change (Now shows -1)
  • ZEXIFTIMESTAMPSTRING – Unaffected

ZCLOUDMASTERMEDIAMETADATA TABLE
  • ZDATA – Unaffected

ZEXTENDEDATTRIBUTES TABLE
  • ZLATITUDE – Unaffected
  • ZLONGITUDE – Unaffected

After doing some recent research, I was able to confirm what Ian published in his blog. I also created a reference table, listed below, containing the fields and results observed. The fields marked Unaffected in the table contain data that is unaffected by the adjustments; these fields can be compared to the affected data fields and can provide insight into possible data adjustments.

Table Column / Field: Affected or Unaffected
  • ZASSET ZDATECREATED: Affected – changed value
  • ZASSET ZSORTTOKEN: Affected – changed value
  • ZASSET ZLATITUDE: Affected – changed value
  • ZEXTENDEDATTRIBUTES ZLATITUDE: Unaffected
  • ZASSET ZLONGITUDE: Affected – changed value
  • ZEXTENDEDATTRIBUTES ZLONGITUDE: Unaffected
  • ZASSET ZADDEDDATE: Unaffected
  • ZADDITIONALASSETATTRIBUTES ZREVERSELOCATIONDATA (embedded plist data): Affected – changed value
  • ZADDITIONALASSETATTRIBUTES ZGPSHORIZONTALACCURACY: Affected – changed value
  • ZADDITIONALASSETATTRIBUTES ZTIMEZONENAME: Affected – changed value
  • ZADDITIONALASSETATTRIBUTES ZTIMEZONEOFFSET: Affected – changed value
  • ZADDITIONALASSETATTRIBUTES ZINFERREDTIMEZONEOFFSET: Unaffected
  • ZADDITIONALASSETATTRIBUTES ZEXIFTIMESTAMPSTRING: Unaffected
  • ZINTERNALRESOURCE ZCLOUDMASTERDATECREATED: Mixed results
  • ZCLOUDMASTER ZCREATIONDATE: Affected – changed value
  • ZCLOUDMASTER ZIMPORTDATE: Unaffected
  • ZCLOUDMASTERMEDIAMETADATA ZDATA (embedded plist data): Unaffected

I also created an iLEAPP parser with embedded SQLite queries that might help with the analysis of these possible user adjustments/data conflicts. The sections below describe each iLEAPP parser, the embedded SQLite queries, and the data being compared within each parser. In order to compare the date created to the EXIF timestamp, a conversion has been written into the SQLite query: the date created is converted to local time. If you believe the device being examined frequently traveled across time zones, some of these parsers might not be applicable. For additional details, please see the embedded SQLite query.
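
The local-time conversion mentioned above can be sketched roughly as follows: shift the Core Data ZDATECREATED value to the Unix epoch, add the recorded timezone offset (stored in seconds), format the result to the EXIF string layout, and compare. This is a hedged approximation of the embedded query, not the query itself; it assumes the EXIF timestamp string uses the common 'YYYY:MM:DD HH:MM:SS' layout and iOS 15+ column names.

    POSSIBLE_DATE_ADJUST_QUERY = """
    SELECT
        zAsset.Z_PK,
        zAsset.ZFILENAME,
        datetime(zAsset.ZDATECREATED + 978307200, 'unixepoch') AS "Created (UTC)",
        strftime('%Y:%m:%d %H:%M:%S',
                 zAsset.ZDATECREATED + 978307200 + zAddAttr.ZTIMEZONEOFFSET,
                 'unixepoch') AS "Created (local, EXIF layout)",
        zAddAttr.ZEXIFTIMESTAMPSTRING AS "EXIF Timestamp String",
        zAddAttr.ZTIMEZONEOFFSET,
        zAddAttr.ZINFERREDTIMEZONEOFFSET
    FROM ZASSET zAsset
    JOIN ZADDITIONALASSETATTRIBUTES zAddAttr ON zAddAttr.ZASSET = zAsset.Z_PK
    WHERE strftime('%Y:%m:%d %H:%M:%S',
                   zAsset.ZDATECREATED + 978307200 + zAddAttr.ZTIMEZONEOFFSET,
                   'unixepoch') != zAddAttr.ZEXIFTIMESTAMPSTRING
    """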

Ph70.1-Possible_Adjust_Date-Timezone-Location-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZASSET ZDATECREATED – Not Equal To – ZADDITIONALASSETATTRIBUTES ZEXIFTIMESTAMPSTRING; AND
  • ZADDITIONALASSETATTRIBUTES ZTIMEZONEOFFSET – Not Equal To – ZADDITIONALASSETATTRIBUTES ZINFERREDTIMEZONEOFFSET; AND
  • ZASSET ZLATITUDE – Not Equal To – ZEXTENDEDATTRIBUTES ZLATITUDE

This parser and SQLite query result will include assets that meet all three conditions:

  • Date Created does not equal the Exif Timestamp String; and
  • Time Zone Offset does not equal Inferred Time Zone Offset; and
  • zAsset Latitude does not equal zExtended Attributes Latitude

Ph71.1-Possible_Adjust_Date-Timezone-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZASSET ZDATECREATED – Not Equal To – ZADDITIONALASSETATTRIBUTES ZEXIFTIMESTAMPSTRING; AND
  • ZADDITIONALASSETATTRIBUTES ZTIMEZONEOFFSET – Not Equal To – ZADDITIONALASSETATTRIBUTES ZINFERREDTIMEZONEOFFSET

This parser and SQLite query result will include assets that meet two conditions:

  • Date Created does not equal the Exif Timestamp String; and
  • Time Zone Offset does not equal Inferred Time Zone Offset

Ph72.1-Possible_Adjust_Date-Location-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZASSET ZDATECREATED – Not Equal To – ZADDITIONALASSETATTRIBUTES ZEXIFTIMESTAMPSTRING; AND
  • ZASSET ZLATITUDE – Not Equal To – ZEXTENDEDATTRIBUTES ZLATITUDE

This parser and SQLite query result will include assets that meet two conditions:

  • Date Created does not equal the Exif Timestamp String; and
  • zAsset Latitude does not equal zExtended Attributes Latitude

Ph73.1-Possible_Adjust_Date-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZASSET ZDATECREATED – Not Equal To – ZADDITIONALASSETATTRIBUTES ZEXIFTIMESTAMPSTRING

This parser and SQLite query result will include assets that meet one condition:

  • Date Created does not equal the Exif Timestamp String

Ph74.1-Possible_Adjust_Timezone-Location-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZADDITIONALASSETATTRIBUTES ZTIMEZONEOFFSET – Not Equal To – ZADDITIONALASSETATTRIBUTES ZINFERREDTIMEZONEOFFSET; AND
  • ZASSET ZLATITUDE – Not Equal To – ZEXTENDEDATTRIBUTES ZLATITUDE

This parser and SQLite query result will include assets that meet two conditions:

  • Time Zone Offset does not equal Inferred Time Zone Offset; and
  • zAsset Latitude does not equal zExtended Attributes Latitude

Ph75.1-Possible_Adjust_Timezone-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZADDITIONALASSETATTRIBUTES ZTIMEZONEOFFSET – Not Equal To – ZADDITIONALASSETATTRIBUTES ZINFERREDTIMEZONEOFFSET

This parser and SQLite query result will include assets that meet one condition:

  • Time Zone Offset does not equal Inferred Time Zone Offset

Ph76.1-Possible_Adjust_Location-PhDaPsql – Excluded By Default:

PhotoData/Photos.sqlite (iOS15-18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within the Photos.sqlite database that have a possible data conflict or data that has been adjusted by a user. If an asset is listed in the results of this parser / SQLite query, it does not guarantee a user made adjustment(s), but it should provide information / data for investigative leads and further analysis. This parser will compare the following columns, and if the data does not match, the asset will be listed in the results:

  • ZASSET ZLATITUDE – Not Equal To – ZEXTENDEDATTRIBUTES ZLATITUDE

This parser and SQLite query result will include assets that meet one condition:

  • zAsset Latitude does not equal zExtended Attributes Latitude

Ph90-99 Photos.sqlite iLEAPP Parsers:

Parser names starting with Ph90-99 are the largest Photos.sqlite parsers and embedded SQLite queries. Each parser is for a specific iOS version. Each parser and embedded SQLite query includes the most complete decoding that I have been able to achieve based on my research. These parsers will also include several fields / columns of data that I am still testing and working on decoding.

Within iLEAPP these parsers have been excluded by default and require additional user input to run. I would encourage using these parsers during a separate instance of iLEAPP. These parsers can take several minutes to complete.

The parser / SQLite query results will include several records for each asset listed in the zAsset table. This is due to the number of times an asset can be listed within each table in Photos.sqlite. A few examples include, but are not limited to:

  • One asset can be listed in multiple Memories, Moments, and/or Highlights
  • One asset can be related to multiple Internal Resource files
  • One asset can be related to multiple Albums
  • One asset can contain multiple faces
  • One asset can contain multiple people
  • One asset can have multiple duplicates

Ph94iOS14REFforAssetAnalysis.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS14):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within Photos.sqlite databases. This parser should be used as the main source for a detailed analysis of the Photos.sqlite database and related assets.

Ph95iOS15REFforAssetAnalysis.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS15):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within Photos.sqlite databases. This parser should be used as the main source for a detailed analysis of the Photos.sqlite database and related assets.

Ph96iOS16REFforAssetAnalysis.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS16):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within Photos.sqlite databases. This parser should be used as the main source for a detailed analysis of the Photos.sqlite database and related assets.

Ph97iOS17REFforAssetAnalysis.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS17):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within Photos.sqlite databases. This parser should be used as the main source for a detailed analysis of the Photos.sqlite database and related assets.

Ph98iOS18REFforAssetAnalysis.py – Excluded By Default:

PhotoData/Photos.sqlite & Syndication.photoslibrary/database/Photos.sqlite (iOS18):

This iLEAPP parser and embedded SQLite query will provide asset record data for assets listed within Photos.sqlite databases. This parser should be used as the main source for a detailed analysis of the Photos.sqlite database and related assets.

The parser file names may contain additional indicators about the artifact being parsed. If you have questions about the parsers please visit my blog, https://theforensicscooter.com/, or email me at, forensicscooter@gmail.com.
