Digital Legacy Quick Prototype To-Do List
This list outlines major steps for a rough start on the digital legacy prototype, focusing on free tools and <1hr total where possible. Major steps are bolded; minors are indented. Checkboxes ([ ]) for Obsidian Tasks plugin. Time estimates are conservative for a tech-savvy user; total ~45-60min excluding grief reading.
- [ ] **Install and Set Up Polycam App on Google Pixel** (Time: 5-10min)
    - Download from Google Play Store (search “Polycam”); free version sufficient for basics. (2min)
    - Grant camera/storage permissions; sign up with email if prompted (optional for guest mode). (2min)
    - If issues: Check app support at https://polycam.zendesk.com or Reddit r/Polycam for Pixel-specific tips (e.g., clear cache if crashes). (1-5min)
- [ ] **Perform Rough Interior House Scan** (Time: 10-15min)
    - Open Polycam; select “New Scan” > Photogrammetry mode (walk-around capture). (1min)
    - Walk slowly around one room (e.g., living room), holding Pixel steady; app auto-captures photos—aim for even lighting, avoid reflections. (5-10min)
    - Process in-app (tap “Finish”); export as GLB file to device storage. (2-3min)
    - If blurry/edges: Retake in better light; see YouTube “Polycam Android tutorial” for fixes (e.g., slower movement). (2min extra if retry)
- [ ] **Add Rough Avatar to Scan** (Time: 5-10min)
    - In Polycam, use “Avatar” or photo upload tool (if available; else import selfie via edit mode). (2min)
    - Take/upload a front-facing selfie; app generates basic 3D model—place in scanned room via drag. (3-5min)
    - Export combined GLB; view in Polycam or free viewer app like Google Scene Viewer. (1-2min)
    - If no avatar feature: Skip to Blender import later; alternative free app KIRI Engine for photo-to-3D (similar time). (2min extra)
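Before pulling the exported GLB into other tools, a quick sanity check can save a retry. A minimal stdlib sketch that validates the 12-byte binary header defined by the glTF 2.0 spec (the file path you pass is up to you):

```python
import struct

def check_glb(path):
    """Sanity-check a GLB export: verify the 12-byte header
    (magic b'glTF', version 2) and that the declared total
    length matches the actual file size."""
    with open(path, "rb") as f:
        data = f.read()
    if len(data) < 12:
        return "too short to be a GLB"
    # Header layout per glTF 2.0 spec: 4-byte magic, uint32 version, uint32 length
    magic, version, length = struct.unpack("<4sII", data[:12])
    if magic != b"glTF":
        return "bad magic (not a GLB file)"
    if version != 2:
        return f"unexpected glTF version {version}"
    if length != len(data):
        return f"declared length {length} != file size {len(data)}"
    return "ok"
```

Anything other than "ok" usually means a truncated or failed export; re-export from Polycam.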
- [ ] **Gather Corpus from Quickest Sources** (Time: 10-15min)
    - HN Comments (Quickest New Source): Use Google BigQuery public dataset. (5-10min)
        - Go to console.cloud.google.com/bigquery; create free sandbox account if none (no billing). (2min)
        - Run SQL: ``SELECT text FROM `bigquery-public-data.hacker_news.full` WHERE author = 'your_username' AND type = 'comment';``—replace 'your_username'; export as CSV. (3-5min)
        - If auth/issues: Manual fallback—visit news.ycombinator.com/user?id=your_username, copy comments page-by-page (slower, 5-10min extra).
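To flatten the BigQuery CSV export into plain text for the corpus, a small stdlib sketch (assumes a single `text` column, matching the query above; the filenames are placeholders):

```python
import csv

def csv_to_text(csv_path, out_path, column="text"):
    """Flatten a one-column CSV export ('text', per the query above)
    into a plain-text file, one comment per line; embedded newlines
    inside a comment are collapsed to spaces."""
    with open(csv_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for row in csv.DictReader(src):
            comment = (row.get(column) or "").replace("\n", " ").strip()
            if comment:
                dst.write(comment + "\n")
```

Example: `csv_to_text("hn_export.csv", "hn.txt")` before the concat step below.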
    - Existing Twitter/FB Downloads: Locate your old JSON exports. (2-3min)
        - Parse with jq (install if needed: `sudo apt install jq` on Ubuntu); e.g., `jq '.[] | .text' twitter.json > twitter.txt`. (2min)
        - If format errors: Open in text editor, manual clean (e.g., grep for `"text":`). (1-2min extra)
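If jq isn't handy, the same extraction can be sketched in stdlib Python, mirroring the `.[] | .text` filter above. This assumes the export is a JSON array of objects with a `text` field; real archive formats vary (e.g., Twitter's `tweet.js` wraps the array in a JS assignment), so adjust as needed:

```python
import json

def extract_texts(json_path, out_path, key="text"):
    """Python stand-in for the jq one-liner: pull the 'text' field
    from each object in a JSON array, one value per output line."""
    with open(json_path, encoding="utf-8") as f:
        items = json.load(f)
    with open(out_path, "w", encoding="utf-8") as out:
        for item in items:
            # Skip entries without the expected field rather than erroring
            if isinstance(item, dict) and key in item:
                out.write(str(item[key]) + "\n")
```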
    - Blogs/Books (If Quick): Manual copy from your sites/files to TXT; skip if >5min. (Optional, 3-5min)
- [ ] **Plug Corpus into Simple Local Setup** (Time: 5-10min)
    - Concat files: `cat hn.csv twitter.txt fb.txt > corpus.txt` (strip CSV headers first). (2min)
    - For basic index: Use SQLite—`sqlite3 corpus.db "CREATE TABLE texts (content TEXT);"` then `.import corpus.txt texts` (one line per row; the implicit rowid serves as an id, and a multi-column schema would make `.import` choke on plain lines). (3-5min)
    - Test query: `sqlite3 corpus.db "SELECT * FROM texts LIMIT 5;"` to view. (1min)
    - If issues: Fallback to plain TXT search with grep; for integration, later link to 3D via simple script. (2min extra)
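The index-and-query steps above can also be done through Python's stdlib `sqlite3`, which sidesteps `.import` delimiter quirks entirely; a sketch (filenames assumed):

```python
import sqlite3

def build_corpus_db(corpus_path, db_path="corpus.db"):
    """Load corpus.txt (one snippet per line) into SQLite;
    blank lines are skipped and rowid serves as the id."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS texts (content TEXT)")
    with open(corpus_path, encoding="utf-8") as f:
        rows = [(line.strip(),) for line in f if line.strip()]
    con.executemany("INSERT INTO texts (content) VALUES (?)", rows)
    con.commit()
    return con

# Keyword search in place of grep:
# con = build_corpus_db("corpus.txt")
# for (snippet,) in con.execute(
#         "SELECT content FROM texts WHERE content LIKE ?", ("%keyword%",)):
#     print(snippet)
```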
- [ ] **Optional: Import to Blender for Viewing/Editing** (Time: 15-30min if pursued)
    - Install on Ubuntu: `sudo apt update && sudo apt install blender`. (5-10min)
    - Open Blender; import GLB (File > Import > glTF 2.0 (.glb/.gltf)). (2min)
    - Basic view/edit: Follow blender.org/docs quick-start (navigate viewport, simple positioning). (5-10min)
    - If steep: YouTube “Blender beginner import GLB” (10min tutorial).
Sources on Grief Extension and Obsession in Digital Afterlife
“Navigating Grief in the Digital Age” from Psychology Today (2024) explores how social media, virtual funerals, and online memorials reshape mourning, emphasizing both connection and challenges like prolonged exposure to loss. Praised for its accessible insights into therapeutic benefits, such as shared tributes fostering community support, it draws on real-world examples to highlight positive adaptations in grief processes. Critics argue it underplays risks like digital overload leading to obsession, focusing more on upsides without deep empirical data on long-term psychological harms. Link: https://www.psychologytoday.com/us/blog/mental-health-nerd/202408/navigating-grief-in-the-digital-age
“AI and the Afterlife: The Ethical and Emotional Costs of Digital Resurrection” from VKTR (2025) delves into AI recreations of the deceased, discussing consent, grief commercialization, and emotional dependency. It’s lauded for balancing ethical debates with case studies, like AI chatbots extending bonds but risking stalled mourning, providing a cautionary framework for tech creators. However, some criticize its alarmist tone, potentially overstating psychological tolls without sufficient longitudinal studies, while overlooking cultural variations in afterlife tech acceptance. Link: https://www.vktr.com/ai-ethics-law-risk/when-ai-brings-back-the-dead-balancing-comfort-and-consequences/
“THE AFTERLIFE IN THE AGE OF AI: A PSYCHOLOGICAL, ETHICAL, AND PHILOSOPHICAL ANALYSIS” from IJCI (undated PDF) analyzes digital memorials through attachment theory and grief frameworks, warning of obsession via perpetual interactions mimicking the dead. Praised for interdisciplinary depth, integrating philosophy (e.g., identity continuity) with psychology to argue for regulated designs, it offers thoughtful critiques of current tech. Detractors note its speculative nature, lacking large-scale evidence, and potential bias toward restricting innovation without exploring user-empowering benefits. Link: https://ijcionline.com/paper/14/14325ijci04.pdf