Friday 7 January 2022

Mike & the Monkey Dumpster Dive Into Samsung Gallery3d App Trash


Monkey assists Mike with another dive into the Samsung Gallery3d App

It all started with a post by Michael Lacombe (iacismikel) on the Physical and RAW Mobile Forensics Google Group in early November 2021.

The post involved a case where a Samsung mobile phone owner claimed that specific images were received but they were immediately deleted after being accessed. Mike was asked if it was possible to determine this. Not knowing the immediate answer to that question, he began to analyze the Samsung Android 9 device.

Mike found some Samsung Gallery3d app deletion artifacts visible in the local.db SQLite database. Specifically, he saw various timestamped log entries in the "log" table which were associated with encoded strings. After a helpful hint about the strings being base64 encoded from forum member "Tony", Mike set off further down the rabbit hole.

Along the way, he found this previous Cheeky4n6monkey post from 2016. Comparing that information to his current case data, he saw that things had changed considerably over the years, but it was enough of a nudge to dig a little deeper. Mike asked if this monkey wanted to tag along and so the adventure began...

Here are some things we have learned on our journey... (mostly Mike, I was just the script monkey)

There are always new things to research

The Samsung Gallery3d app has been around for years and, according to Google Play, it was last updated in 2019.
Opening the AndroidManifest.xml file from a test device's Gallery3d Android Package (APK) in Android Studio shows:


According to the Android Developer documentation, versionName is displayed to the user, whereas versionCode is a positive integer which increases with each release and can be used to prevent downgrades to earlier versions.

This app is updated frequently. When searching for test data, we found that nearly every device we looked at contained a different version of the app, which, in turn, contained different information stored within the application folder and the database itself.

Digging into app artifacts can lead to additional information that is not currently being parsed

As far as we could ascertain, there were no commercial or non-commercial forensic tools which process the Samsung Gallery3d app database for deletion artifacts.

Some open source tools that we used to analyze the data and the APK include:
For Data Analysis:
- DB Browser for SQLite for viewing/exporting SQLite databases
- Cyberchef to base64 decode strings
- Base64 Decode and Encode website to base64 decode strings
- Epochconverter to confirm timestamp types
- Android Studio
For APK reversing:
- dex2jar to convert an APK's classes.dex to Java .jar
- JD-GUI to view source code from a .jar file
- JADX to view source code directly from APK file
We also wrote our own Python 3 scripts to assist with batch conversion of base64 encoded strings and output to Tab-Separated Values (TSV) format.
These scripts are available here.

Some observations for the Samsung Gallery3d app

This is a stock app installed on Samsung devices. It has library dependencies that are part of the Samsung Android framework. Consequently, there doesn’t appear to be an easy way (if at all) to install the application on a non-Samsung device. 

The Samsung Gallery3d app is located on the user data partition at: 

Files that are sent to the trash from within the app are located in the .Trash directory.
Due to differences in each version of the application and because the research was driven by Mike’s case, we decided to focus this blog on that application version.
Within the /data/ directory, there was a cache directory and a databases directory.

Cache Directory

There are multiple numbered sub-directories contained within the cache directory.

In this instance, the /0 folder contained larger thumbnail images, ranging in widths of 225-512 pixels and heights of 256-656 pixels while the /1 folder had smaller thumbnails ranging in widths of 51-175 pixels and heights of 63-177 pixels. There were also /2, /3 and /4 folders. /2 and /3 were empty and /4 had a single thumbnail that was 320x320 in size.

There doesn’t seem to be anything useful here beyond the thumbnails themselves. The names of the thumbnails seem to be generated using a hash algorithm.

Databases Directory

Contained within the databases directory is the local.db SQLite database.

This database contains various information including:

- Albums in the gallery ("album" table)

- A log that records various actions associated with the app ("log" table), e.g. move to trash, empty trash

- Items that are currently in the Trash bin ("trash" table)

In later versions, we noticed another table called "filesystem_monitor". This contained timestamps, app package names and base64 encoded file paths. However, as this table was not present in Mike's case data and we are not sure what triggers these records, it requires further research.

Table Observations

"album" Table

Here is the "album" table schema:

__absPath TEXT, 
__Title TEXT, 
folder_id INTEGER, 
folder_name TEXT, 
default_cover_path TEXT, 
cover_path TEXT, 
cover_rect TEXT, 
album_order INTEGER, 
album_count INTEGER, 
__ishide INTEGER, 
__sefFileType INTEGER  DEFAULT 0, 
__dateModified INTEGER  DEFAULT 0

Here are some screenshots of an example "album" table:

"album" Table Screenshot 1

"album" Table Screenshot 2

Here are some selected "album" table fields of interest:

Field Name: __bucketID

This is generated by calling the Java hashcode algorithm on the full path of the album ("__absPath"). See the java-hashcode script later in this post for a proof of concept.

Example value: -1313584517

Field Name: __absPath

The path of the album.

Example: /storage/emulated/0/DCIM/Screenshots

Field Name: cover_path

The image associated with the corresponding album.

Example: /storage/emulated/0/DCIM/Screenshots/Screenshot_20200530-054103_One UI Home.jpg

Field Name: album_count

The current number of files stored within the album.

Example: 14

Because the file paths stay the same, __bucketID values have been found to be consistent across devices. This can help show whether custom albums were created, as well as application-specific albums such as Facebook, Snapchat, etc. Recovering deleted records here can reveal deleted albums and the names of deleted images that were once used as album covers. Cover path values can point to file names of interest, many of which contain timestamp information in the file name. This can potentially assist with tying usage of a particular app to a specific time.

No extraction script was written for the "album" table as DB Browser for SQLite can be used directly to copy/paste the album data.

"log" Table

Here is the "log" table schema:

__category INTEGER NOT NULL, 
__timestamp TEXT, 
__log TEXT

Here is a screenshot of an example "log" table:

"log" Table Screenshot

Here are some selected "log" table fields of interest:

Field Name: __timestamp

Timestamp text string (formatted YYYY-MM-DD HH:MM:SS in Local Time) indicating when a particular log entry occurred.

Example: 2020-01-09 16:17:14

Field Name: __log

Proprietary formatted text string which lists the "action" performed (see next table) and the base64 encoded paths of relevant files.

Some observed log "actions" include:

Log Action



Unknown when this is triggered. It tells how many files are currently in the trash.




This occurs when the user moves a single file to the trash from the timeline or gallery view.




This occurs when the user moves more than one file to the trash from the timeline or gallery view. Each trash entry is enclosed in brackets.




This occurs when the trash is manually emptied and a single file is in the trash at that time.




This occurs when the trash is manually emptied and contains more than one file. Each empty entry is enclosed in brackets.




This occurs when a file is auto-deleted after staying in the trash for a predetermined amount of time as described in the settings for the app.


[EMPTY_EXPIRED][22][22][2020-05-14 00:00:00]

Other operations not observed in our data but declared in the source code (see TrashHelper class, DeleteType enum):



Here is an example of how to manually decode the base64 encoded string from a "__log" field:
The original value is:

We copy the base64 string enclosed by the [ ] (highlighted in Yellow):

Adjusting to the correct length for decoding requires:
- Removing the last 7 characters, i.e. "bakWlla" (highlighted above in Red)
- Removing 3 to 6 characters from the start of the string until the length is a multiple of 4, i.e. removing "eTgcy" (highlighted above in Green)
We then:
- Base64 decode the string
- Remove padding characters such as Black Star and Black Circle

For our example above, we adjust the base64 string to:

which decodes via CyberChef or the Base64 Decode and Encode website to:

We can then manually remove the following randomly added padding characters:
Unicode Code Point U+2605 = "Black Star"
Unicode Code Point U+25CF = "Black Circle"
Unicode Code Point U+25C6 = "Black Diamond"

Here is what the output from Cyberchef looks like:

Base64 Decode using Cyberchef

Cyberchef has a handy feature of showing the number of characters in the input string ("length"). This can be used when determining how many characters to remove to get an input length that is a multiple of 4.
Here is the output:

Base64 Decode using the Base64 Decode and Encode website

The log table's "__log" field format varies according to APK version. We have only looked at versions v10.0.21.5, v10.2.00.21 (main focus) and v11.5.05.1.

Consequently, two versions of a "log" table parsing script were written: one for v10 and one for v11.

"trash" Table

Here is the "trash" table schema:

__Title TEXT, 
__mediaType INTEGER, 
__width INTEGER, 
__height INTEGER, 
__orientation INTEGER, 
__originPath TEXT, 
__originTitle TEXT, 
__deleteTime INTEGER, 
__storageType INTEGER, 
__burstGroupID INTEGER, 
__bestImage INTEGER, 
__cloudServerId TEXT, 
__cloudTP TEXT, 
__restoreExtra TEXT, 
__volumeName TEXT, 
__volumeValid INTEGER, 
__expiredPeriod INTEGER

Here are some screenshots of an example "trash" table:

"trash" Table Screenshot 1

"trash" Table Screenshot 2

There are only 10 entries stored in this example table. The entries in this table correspond with live files in the .Trash directory. All other files located in .Trash are overwritten files with "__Title" file names but no date/time information.
Here are some selected trash table fields of interest:

Field Name: __absPath

Current path and filename of the deleted file.

Field Name: __Title

Current filename of the deleted file.

Field Name: __originPath

Original path and filename.

Field Name: __originTitle

Original filename.

Field Name: __deleteTime

UNIX ms time in UTC.

Field Name: __restoreExtra

JSON formatted string containing various metadata such as "__dateTaken" (in UNIX ms time, in Local Time), "__latitude" and "__longitude".

Example: {"__is360Video":false,"__isDrm":false,"__isFavourite":false,"__cloudOriginalSize":0,"__cloudRevision":-1,"__fileDuration":0,"__recordingMode":0,"__sefFileSubType":0,"__sefFileType":-1,"__cloudTimestamp":1592678711350,"__dateTaken":1592669230000,"__size":98526,"__latitude":0,"__longitude":0,"__capturedAPP":"","__capturedURL":"","__cloudServerPath":"","__hash":"","__mimeType":"image\/jpeg","__resolution":"","__recordingType":0,"__isHdr10Video":false}
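These millisecond timestamps are easy to sanity-check with Python's standard library. A minimal sketch using the "__dateTaken" value from the example above (interpreting it as UTC for illustration; remember the app may store some values in Local Time):

```python
from datetime import datetime, timezone

# "__deleteTime" and "__dateTaken" are stored as milliseconds since
# 1 Jan 1970, so divide by 1000 before handing them to datetime.
date_taken_ms = 1592669230000  # "__dateTaken" from the example above

print(datetime.fromtimestamp(date_taken_ms / 1000, tz=timezone.utc).isoformat())
# 2020-06-20T16:07:10+00:00
```

This matches what Epochconverter reports for a millisecond timestamp once the last three digits are treated as the fractional part.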

The "__Title" value (as seen in "__absPath") is derived by calling a proprietary Crc::getCrc64Long function on the "__originPath" value. Note: This value is generated via a different method to the album table's "__bucketID" field.
One script was written to parse the "trash" table.
There are other tables in local.db but due to time constraints and available test data, we concentrated on the "log" and "trash" tables.
On some later app versions, we noticed a "filesystem_monitor" table which listed fields such as:
package, date_event_occurred (suspected ms since 1 Jan 1970), __data (base64 encoded filename), event_type (meaning currently unknown). This table requires further research.


Some initial Python 3 scripts were written for parsing the "log" and "trash" tables.
Due to the different "__log" field formats observed, two versions were written for the "log" table: one for v10 and one for v11.
Both of these scripts extract various fields from the "log" table and base64 decode any encoded path names that we have observed in our data. The v11 version was written to handle the differently formatted "__log" field values.
Here is the help text for the v10 log script (the main focus of this research):

python3 -h
usage: [-d inputfile -o outputfile]

Extracts/parses data from's (v10) local.db's log table to output TSV file

optional arguments:
  -h, --help   show this help message and exit
  -d DATABASE  SQLite DB filename i.e. local.db
  -o OUTPUT    Output file name for Tab-Separated-Value report

Here is a usage example (Note: a "__log" field may contain multiple base64 encoded file paths. The script should find/extract all of them):

python3 -d s767vl-local.db -o s767vl-log-output.tsv
Running 2021-11-20

_id = 1
Found valid path = /storage/emulated/0/DCIM/Facebook/FB_IMG_1578490745732.jpg
 for: 4pePL+KXj3N0b3Lil49h4pePZ2XimIUvZW3imIV14pePbOKYhWHil4904pePZeKYhWTimIUv4pePMC9E4piFQ0nimIVNL+KYhUZh4pePY+KYhWXil49ib2/imIVr4pePL+KXj0ZCX+KYhUnil49N4pePR1/imIUx4piFNeKYhTfimIU4NOKYhTnimIUw4piFNzTimIU1N+KXjzPimIUy4piFLuKXj2rimIVwZw==
_id = 2
Found valid path = /storage/emulated/0/DCIM/Screenshots/Screenshot_20200106-112041_Instagram.jpg
 for: 4pePL+KXj3N04pePb+KYhXJh4pePZ+KYhWXil48v4piFZeKXj23il4914piFbOKYhWHil4904piFZeKYhWTimIUv4pePMC/il49EQ0nimIVN4piFL+KXj1Nj4piFcuKXj2XimIVl4pePbuKXj3Pil49o4piFb+KYhXTil49z4pePL+KXj1Nj4pePcmXimIVl4pePbuKXj3Pil49ob3Til49f4pePMuKXjzAy4piFMOKXjzDil48x4pePMOKXjzYt4piFMeKYhTEy4piFMOKXjzQx4pePX+KYhUlu4pePc3TimIVhZ+KXj3Jh4piFbeKXjy5qcGc=
_id = 3
Found valid path = /storage/emulated/0/DCIM/Screenshots/Screenshot_20191231-192158_Snapchat.jpg
 for: 4piFL3Pil4904piFb+KXj3LimIVh4pePZ2Uv4pePZeKXj2114pePbOKYhWHimIV0ZeKXj2TimIUvMOKYhS9EQ+KYhUlN4piFL1PimIVj4piFcmXil49l4pePbuKXj3Pil49ob3TimIVzL1PimIVj4piFcmXil49lbnNo4pePb3TimIVf4piFMjAxOTHimIUy4pePM+KXjzEtMeKXjznimIUyMeKXjzXil4844piFX+KXj1Nu4pePYeKYhXBjaGF0LmrimIVw4pePZw==
_id = 4
Found valid path = /storage/emulated/0/DCIM/Screenshots/Screenshot_20191231-191604_Snapchat.jpg
 for: 4piFL3Pil490b+KXj3Lil49hZ+KXj2XimIUv4piFZeKXj23imIV14pePbGF04pePZeKXj2QvMOKYhS9E4pePQ0nil49N4pePL+KXj1Pil49j4piFcuKYhWXimIVl4pePbuKXj3PimIVo4piFb+KXj3TimIVz4pePL+KXj1Nj4piFcuKYhWXimIVlbuKXj3Pil49ob+KYhXTimIVf4piFMjDimIUx4piFOTHil48yMzHimIUt4piFMeKYhTkx4pePNuKXjzDimIU04pePX+KYhVPimIVu4piFYeKXj3DimIVj4pePaOKYhWHil490LmpwZw==
_id = 5
_id = 6
Found valid path = /storage/emulated/0/Download/39dc29a626fe74cef450f2dc931e134809ea3228.jpeg.jpg
 for: 4piFL3N0b+KYhXLil49hZ+KXj2Uv4pePZeKYhW114piFbGF0ZWTil48v4pePMC/il49Eb3fil49u4pePbG/il49h4piFZOKXjy/il48z4pePOeKYhWRj4piFMjlh4pePNuKYhTLil482ZuKYhWU34piFNGPil49l4piFZuKXjzTil4814piFMGYy4pePZOKYhWPil485MzFl4pePMTM04piFODDil485ZWHil48z4pePMuKXjzLimIU44piFLuKXj2ril49w4pePZeKXj2cu4piFauKYhXDimIVn
Found valid path = /storage/emulated/0/Download/26349f2e382fbacceb2ee3951bc8b30ff7369234.jpeg.jpg
 for: 4pePL+KYhXN04pePb+KXj3Lil49h4piFZ2Uv4piFZW11bGF04pePZeKYhWQv4piFMC9E4piFb+KXj3fil49ubOKXj2/imIVh4piFZOKYhS/imIUy4piFNuKXjzPil480OWbimIUy4piFZeKYhTPil4844piFMuKXj2bimIVi4pePYWNjZWLimIUy4piFZWXil48zOTXil48x4pePYmM4YuKYhTPil48w4pePZmY34piFMzbil485MuKYhTPimIU04pePLuKXj2ril49wZeKXj2fil48uauKYhXDil49n
Found valid path = /storage/emulated/0/Download/user1010953_863e8addc06c.jpg
 for: L+KXj3PimIV0b+KYhXJhZ+KYhWUv4pePZeKYhW3imIV1bOKYhWHil4904piFZWTil48v4pePMOKXjy9Eb3fimIVu4pePbOKYhW/il49h4pePZOKXjy914pePc+KYhWXimIVyMeKXjzAx4piFMOKXjznimIU1M+KXj1/imIU4NuKXjzNl4pePOGHil49kZGPimIUw4piFNuKYhWPimIUuauKYhXDil49n
Found valid path = /storage/emulated/0/Download/11b5a77b2141eb3ec139240fb4cf7d1288d8bbfe.png
 for: 4piFL+KYhXN0b3Lil49hZ2Xil48vZW3il4914piFbOKYhWF04piFZeKYhWTimIUv4pePMOKYhS/il49E4pePb+KXj3dubOKXj29hZOKXjy8xMeKXj2Lil481YeKXjzc34pePYuKYhTIxNOKXjzHimIVl4piFYuKXjzPil49l4pePY+KYhTHimIUz4piFOeKXjzLimIU04pePMGbil49i4piFNOKYhWNm4pePN+KYhWQx4piFMuKYhTg44piFZDhi4piFYuKYhWZlLuKXj3DimIVuZw==
Found valid path = /storage/emulated/0/Download/7cd62fdba485f143f88370ed838778e2ae222c26.jpeg.jpg
 for: L+KYhXPil4904pePb3Jh4piFZ+KYhWXil48vZeKYhW3il491bOKYhWHimIV0ZeKYhWTil48v4pePMC9E4pePb3fil49ubOKXj2/il49h4pePZOKXjy/imIU3Y+KXj2Q24piFMuKXj2Zk4piFYuKXj2Hil4804piFOOKXjzVm4piFMeKXjzTil48z4pePZjjil4844pePM+KXjzfil48wZeKXj2Q44piFM+KXjzjil483NzjimIVlMmHil49l4piFMjIy4pePYzI2LuKYhWpwZWfimIUuanDil49n
Found valid path = /storage/emulated/0/Download/8a0f38ee75d695af81ab0a7ca8b2d1f6efcf5630.jpeg.jpg
 for: 4pePL+KYhXPimIV04piFb3LimIVh4piFZ+KYhWUvZeKXj211bOKYhWHil4904piFZeKXj2TimIUvMOKYhS/il49Eb3fimIVubOKYhW9hZC844piFYeKYhTBm4pePM+KXjzhlZeKYhTfil4814piFZDbimIU5NWFm4piFOOKYhTHil49h4pePYuKYhTDimIVh4piFN+KYhWNhOOKYhWLimIUy4pePZOKXjzHil49m4piFNuKYhWXil49m4piFY+KYhWbil4814piFNuKXjzMw4piFLuKXj2rimIVw4piFZeKXj2fil48uauKYhXBn
Found valid path = /storage/emulated/0/Download/15510fd162dfe3bbdd3795cee6776c8db408577c.jpeg.jpg
 for: L+KXj3Pil4904piFb+KYhXLimIVh4piFZ2XimIUv4pePZeKXj23imIV14piFbGHimIV04pePZWQvMOKXjy9E4piFb3fil49u4pePbOKXj2/il49h4piFZOKYhS/il48x4pePNeKYhTXil48x4pePMOKXj2ZkMeKYhTbimIUy4pePZOKYhWZl4pePM+KXj2Lil49i4pePZGTil48z4piFN+KXjznimIU1Y+KXj2XimIVl4pePNuKYhTfimIU34pePNmPil4844pePZGLimIU04piFMOKXjzjil4814piFN+KXjzdj4piFLuKYhWpw4pePZeKYhWfimIUu4pePanDimIVn
Found valid path = /storage/emulated/0/Download/5b4565b2076c02e09eea562ae70c36bc419e2ed1.jpeg.jpg
 for: 4piFL+KXj3N04piFb3Jh4piFZ2XimIUvZeKYhW114piFbOKXj2F0ZeKYhWTil48v4pePMOKXjy9E4pePb+KXj3fimIVu4piFbG9hZOKYhS/il4814pePYjTil4814pePNuKYhTXil49i4pePMuKYhTDimIU34pePNmPimIUw4piFMuKYhWUw4piFOeKYhWXimIVl4piFYeKXjzXimIU24pePMuKXj2Hil49l4piFN+KXjzDimIVjM+KYhTbil49i4piFY+KYhTTil48xOeKXj2Xil48y4piFZeKYhWQx4piFLmrimIVwZeKYhWfil48u4piFauKXj3Dil49n
Found valid path = /storage/emulated/0/Download/6f559e2f1df8ca033a6f7190416d9351b1bcc942.jpeg.jpg
 for: L+KYhXPil4904piFb+KXj3LimIVh4pePZ+KXj2XimIUv4piFZW114piFbOKXj2HimIV04piFZWTimIUv4pePMC9E4pePb+KYhXdu4piFbOKXj29h4piFZOKYhS82ZjU14pePOWXimIUy4pePZuKXjzHimIVkZuKXjzjil49jYTAz4pePM+KXj2HimIU24pePZuKXjzfil48x4pePOTDimIU04piFMTbimIVkOeKYhTM14piFMWLil48xYmPimIVj4pePOeKXjzTimIUy4piFLmril49wZWcu4piFanBn
Found valid path = /storage/emulated/0/Download/c854ab3ee061d52038320a605694c002d117648d.jpeg.jpg
 for: L+KYhXN04pePb3LimIVh4pePZ2UvZeKXj23imIV14pePbGHil4904pePZWTimIUvMC/imIVEb+KYhXfil49u4piFbG/imIVhZOKXjy/imIVj4piFOOKXjzU04piFYeKYhWLil48z4piFZWXimIUw4piFNjFkNeKYhTIwM+KXjzgzMjBh4piFNuKYhTA1NuKXjznil4804pePYzDimIUw4piFMuKYhWTimIUx4piFMeKYhTc24piFNOKYhTjimIVk4pePLuKXj2pw4pePZeKYhWfimIUu4pePauKXj3Bn

[redacted for brevity]

Found valid path = /storage/emulated/0/Download/mj03kueepo441.jpg
 for: 4piFL+KYhXN0b3JhZ+KYhWXimIUv4piFZW3imIV14piFbOKXj2HimIV04piFZeKXj2Til48vMC/il49E4piFb+KYhXdu4piFbOKYhW/imIVh4pePZOKYhS/imIVtajDimIUz4piFa+KYhXXil49lZeKXj3Dil49vNOKYhTTimIUxLuKYhWrimIVwZw==
Found valid path = /storage/emulated/0/Download/Ab3UCPAD5XSzfTI0rqFuknyufbEe9PWkKnJOicjJhFg.jpg
 for: 4pePL3N0b+KXj3Lil49hZ+KYhWXil48v4piFZW3il4914piFbOKXj2HimIV04pePZWTimIUvMOKXjy9E4piFb3fil49ubOKYhW9h4pePZOKYhS/il49B4piFYuKYhTNV4piFQ+KXj1Dil49BRDXimIVY4piFU3pm4piFVOKXj0nil48w4piFcuKYhXHil49G4piFdeKXj2vimIVueeKXj3XimIVmYuKXj0XimIVl4pePOeKYhVDil49Xa+KXj0tu4pePSk/imIVpY+KXj2rimIVKaOKYhUbimIVnLuKYhWrimIVw4piFZw==

Processed/Wrote 364 entries to: s767vl-log-output.tsv

Exiting ...

Here is a screenshot of the output TSV (s767vl-log-output.tsv) imported into a LibreOffice Calc spreadsheet: TSV Output

Note: If you have issues with data not appearing correctly in MS Excel / LibreOffice Calc, please ensure the Import column type is set to TEXT.

A Python 3 script was also written to parse the "trash" table:

Here is the help text for the trash script:

python3 -h
usage: [-d inputfile -o outputfile]
Extracts/parses data from's (v10) local.db's trash
table to output TSV file
optional arguments:
  -h, --help   show this help message and exit
  -d DATABASE  SQLite DB filename i.e. local.db
  -o OUTPUT    Output file name for Tab-Separated-Value report

Here is a usage example:

python3 -d s767vl-local.db -o s767vl-trash-output.tsv
Running v2021-11-12

Processed/Wrote 14 entries to: s767vl-trash-output.tsv
Exiting ...

Here is a screenshot of the output TSV (s767vl-trash-output.tsv) imported into a LibreOffice Calc spreadsheet: TSV Output

Note: If you have issues with data not appearing correctly in MS Excel / LibreOffice Calc, please ensure the Import column type is set to TEXT.

Some additional scripts were written and included in the GitHub repo.

The java-hashcode script was written to convert a given path to a "__bucketID" value as seen in the "album" table.

Here is the help for the java-hashcode script:

python3 -h
usage: [-l | -u] -i inputfile
Read input strings/paths from a text file (one per line) and prints out the
equivalent Java hashcode
optional arguments:
  -h, --help    show this help message and exit
  -i INPUTFILE  Input text filename
  -l            (Optional) Converts input string to lower case before hashing
  -u            (Optional) Converts input string to UPPER case before hashing

Here is an example of how to calculate a bucketID for the following paths - "/storage/emulated/0/DCIM/Screenshots" and "/storage/emulated/0/Download".

We start by writing the 2 paths (one per line) to a text file called "inputhash.txt"

Example inputhash.txt for

Next, we call the script with "inputhash.txt" set as the input file.

Note the usage of the "-l" (lowercase L) argument to convert the path to lowercase before calling the hashcode function.

Here is the command line example:

python3 -i inputhash.txt -l
Running 2021-12-23
/storage/emulated/0/dcim/screenshots = -1313584517
/storage/emulated/0/download = 540528482
Processed 2 lines - Exiting ...

We can see the hashcode values for those paths match the values recorded in the "album" table:

The /storage/emulated/0/dcim/screenshots path converts to a bucketID = -1313584517

The /storage/emulated/0/download path converts to a bucketID = 540528482

"album" Table bucketID Example
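Java's String.hashCode algorithm is simple enough to reproduce in a few lines of Python. Here is a minimal sketch (our own reimplementation, not the actual script; checked against the two bucketID values above):

```python
# Reimplementation of Java's String.hashCode (h = 31*h + c per character),
# which the app appears to use to derive "__bucketID" from the
# lower-cased album path.
def java_hashcode(s: str) -> int:
    h = 0
    for ch in s:
        # keep only 32 bits, like Java's int arithmetic
        h = (31 * h + ord(ch)) & 0xFFFFFFFF
    # reinterpret the result as a signed 32-bit integer
    return h - 0x100000000 if h >= 0x80000000 else h

print(java_hashcode("/storage/emulated/0/dcim/screenshots"))  # -1313584517
print(java_hashcode("/storage/emulated/0/download"))          # 540528482
```

The masking and sign adjustment are needed because Python integers do not overflow the way Java's 32-bit ints do.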

Other scripts (requiring further testing) include the version 11 variants mentioned earlier.

These scripts were written for parsing test data from app version v11.5.05.1, which differed from our targeted app version v10.2.00.21. The version 11 scripts have been included in the GitHub repo but will not be described further in this post.

Please note that all scripts were written using our limited test data, so they will probably struggle parsing data from other versions.

Script wise, the most interesting part was automating the encoded path decoding.

Using tools such as dex2jar, JD-GUI and JADX we were able to find the code responsible for the path encoding (see Logger.class::getEncodedString method) and wrote a corresponding Python function to base64 decode the encoded path string.

Depending on your APK, using JD-GUI might require first extracting the classes.dex from the APK, then running dex2jar on the classes.dex before viewing the .jar file in JD-GUI. However in our case, Mike was able to run dex2jar on his APK directly and then use JD-GUI to view the Java code.

JADX can open/reverse an APK without the dex2jar step.

Some methods/variable names were not translated in JD-GUI and some were not translated in JADX so it’s probably worth trying both JD-GUI and JADX.

As mentioned previously, the path decode process is:

  1. Remove the last 7 base64 encoded chars
  2. Remove 3-6 characters at start of encoded string until a valid base64 length (multiple of 4 bytes)
  3. Perform base64 decode
  4. Remove any special padding chars, e.g. Black Star, Black Circle
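Those four steps can be sketched in Python as follows (a simplified reimplementation of the decode logic, not the original script; decode_log_path is our own naming):

```python
import base64

def decode_log_path(encoded: str) -> str:
    """Decode a Samsung Gallery3d "__log" encoded path string."""
    # Step 1: drop the last 7 characters
    trimmed = encoded[:-7]
    # Step 2: drop 3-6 characters from the start until the length
    # is a multiple of 4 (a valid base64 length)
    for n in range(3, 7):
        if (len(trimmed) - n) % 4 == 0:
            trimmed = trimmed[n:]
            break
    # Step 3: base64 decode
    decoded = base64.b64decode(trimmed).decode("utf-8")
    # Step 4: strip the randomly inserted padding characters
    for pad in ("\u2605", "\u25CF", "\u25C6"):  # Black Star, Circle, Diamond
        decoded = decoded.replace(pad, "")
    return decoded
```

Exactly one value of n in the 3-6 range can make the remaining length a multiple of 4, which is why the loop can simply break on the first match.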

So the basic process for the "log" table script was:

- Query the database, e.g. "SELECT _id, __category, __timestamp, __log FROM log ORDER BY __timestamp ASC;"
- For each row:
  - Extract the "__log" and "__timestamp" fields
  - Decode the base64 encoded path(s) (there can be more than one path per log item)
  - Print extracted fields and decoded path field(s) to TSV
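As a sketch of the extraction step, candidate base64 runs can be pulled out of a "__log" value with a regular expression before decoding. The bracket delimiters and minimum-length threshold below are our own assumptions based on the log entries observed in our data:

```python
import re

# Candidate base64-encoded runs enclosed in the [...] markers seen in
# "__log" values; the 16-character minimum filters out short tokens
# such as counts and action names (an assumption, not a specification).
B64_RUN = re.compile(r"\[([A-Za-z0-9+/=]{16,})\]")

def extract_encoded_runs(log_value: str) -> list:
    """Return all candidate base64 runs found in a "__log" field value."""
    return B64_RUN.findall(log_value)
```

Each returned run would then be handed to the path-decoding routine, since a single "__log" entry can contain multiple encoded paths.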

The process for the "trash" table script was:

- Query the database, e.g. "SELECT __absID, __absPath, __Title, __originPath, __originTitle, __deleteTime, __restoreExtra FROM trash ORDER BY __deleteTime ASC;"
- For each row:
  - Extract the __absPath, __originPath, __deleteTime and __restoreExtra (metadata) fields
  - Print extracted fields to TSV (no path decoding required)
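The trash-table steps can be sketched with Python's built-in sqlite3 and csv modules (a simplified outline under the schema above, not the actual script; export_trash is our own naming):

```python
import csv
import sqlite3

TRASH_QUERY = ("SELECT __absID, __absPath, __Title, __originPath, "
               "__originTitle, __deleteTime, __restoreExtra "
               "FROM trash ORDER BY __deleteTime ASC;")

def export_trash(db_path: str, out_tsv: str) -> int:
    """Dump selected "trash" table fields to a TSV report; returns row count."""
    con = sqlite3.connect(db_path)
    rows = con.execute(TRASH_QUERY).fetchall()
    con.close()
    with open(out_tsv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["__absID", "__absPath", "__Title", "__originPath",
                         "__originTitle", "__deleteTime", "__restoreExtra"])
        writer.writerows(rows)
    return len(rows)
```

Because the "trash" fields are stored as plain text, no base64 decoding is needed here; timestamp conversion and "__restoreExtra" JSON parsing can be layered on top.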


All this research led to a deeper understanding of reverse engineering Android apps, new and unique hashcode algorithms and different encoding techniques. Looking further into app databases that may/may not be parsed by existing tools can still lead to new information, folders of interest, log files, etc. You may discover new data that was introduced in newer versions of Android or the particular app.

For Mike's case, using the research and scripts from this post showed that the user was in the habit of taking screenshots or downloading images and then deleting them and emptying the trash a short time later. The web browser was used to access the images in question but deleting web history was also a frequent process. The recovered names of the screenshots showed that the user had used the web browser at specific times. Unfortunately these dates and times didn’t match the times in question but it did lead to other times to investigate that weren’t found in other parsed data.
Researching this post with Mike allowed Monkey to learn more about the Samsung Gallery app, gain further experience with reversing an Android APK and keep his Python skills fresh. Like any language, fluency deteriorates with lack of use.

Various Python 3 scripts were written to assist with parsing the "log" and "trash" tables from the Samsung Gallery3d app (v10.2.00.21). These tables can potentially store information regarding image deletion performed from within the Samsung Gallery3d app, e.g. timestamps and original file paths.

This post also demonstrated how collaborative research can lead to increased output and new tools, for example, combining Mike's testing observations with Monkey's scripting. The opportunity to work with someone else with different knowledge, skills, experience and a fresh perspective is invaluable. Utilizing this experience can be just as good as, if not better than, attending a training class or a webinar.

Special Thanks to Mike for sharing his research and co-authoring this post - hopefully, we can collaborate again in the future :)