
For Project 2 we used geolocation with p5.js to create a mobile experience that changes based on the user's location.
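As a starting point, here is a minimal sketch (not the project's actual code) showing how the browser's geolocation API can feed a p5.js canvas; the latitude threshold below is just an example value.

```javascript
// Minimal p5.js + geolocation sketch. Requires HTTPS (or localhost)
// for the browser to grant location access on a phone.
let lat = 0;
let lon = 0;

function setup() {
  createCanvas(windowWidth, windowHeight);
  // Ask the browser for continuous position updates.
  navigator.geolocation.watchPosition(
    (pos) => {
      lat = pos.coords.latitude;
      lon = pos.coords.longitude;
    },
    (err) => console.error(err),
    { enableHighAccuracy: true }
  );
}

function draw() {
  // Change the experience depending on where the user is standing
  // (the 34.07 threshold is an arbitrary example).
  background(lat > 34.07 ? 'steelblue' : 'darkslategray');
  fill(255);
  textAlign(CENTER, CENTER);
  text(`${lat.toFixed(5)}, ${lon.toFixed(5)}`, width / 2, height / 2);
}
```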



My project went through many phases of ideas, tests, successes, and failures. Initially my idea was to have a mobile website where people could upload a selfie or photoscan of themselves and save it to a location for other people to see. Using WebGL, these pictures could be navigated by walking closer to their drop location and viewing the various photoscans of people through the phone camera. The idea was to create a living, physical memory of locations and the people who have visited them. A big city like LA can often swallow people up and leave areas feeling vacant and empty.

This idea was ultra-ambitious, so I tried to scale it back by prebuilding the photoscans and using WebGL to place them at various parts of campus. I tried to find strangers willing to have their photo taken, then uploaded their photoscans into p5.js using WebGL. However, I found WebGL too difficult to learn and work with in such a short time period.
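For reference, a rough sketch of that intermediate approach, assuming a photoscan exported as an .obj file (the filename here is hypothetical):

```javascript
// Load one photoscan and spin it in p5.js WEBGL mode.
let scan;

function preload() {
  // The second argument normalizes the model so it fits the canvas.
  scan = loadModel('stranger.obj', true);
}

function setup() {
  createCanvas(windowWidth, windowHeight, WEBGL);
  noStroke();
}

function draw() {
  background(0);
  // Slowly rotate the photoscan, the way the final pieces spin.
  rotateY(frameCount * 0.01);
  normalMaterial();
  model(scan);
}
```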

I scaled back once again and rendered out the animations with an alpha channel so the photoscans would spin on top of the mobile camera view. However, the mobile camera code again wasn't compatible with my iPhone, even after updating.
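The intended compositing looked roughly like the sketch below: draw the phone's rear camera feed, then layer pre-rendered PNG frames with an alpha channel on top of it. The frame filenames and frame count are assumptions for illustration.

```javascript
// Camera feed underneath, transparent pre-rendered frames on top.
let cam;
let frames = [];
const FRAME_COUNT = 60; // assumed number of rendered frames

function preload() {
  for (let i = 0; i < FRAME_COUNT; i++) {
    // e.g. scan_000.png ... scan_059.png (hypothetical filenames)
    frames.push(loadImage(`scan_${String(i).padStart(3, '0')}.png`));
  }
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  // Request the rear ("environment") camera on mobile browsers.
  cam = createCapture({
    video: { facingMode: 'environment' },
    audio: false
  });
  cam.hide();
}

function draw() {
  image(cam, 0, 0, width, height);            // live camera underneath
  const f = frames[frameCount % FRAME_COUNT]; // current animation frame
  image(f, 0, 0, width, height);              // alpha PNG composited on top
}
```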

The final product resulted in a series of five photoscanned people: two strangers, two friends, and myself. If a user were to walk to a specific part of campus (Janss Steps, the Bruin Bear, Broad, or Bruin Walk), they would see a spinning photoscan of a past visitor to that area.
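Something like the following location check drives that behavior; the coordinates are approximate placeholders for the campus spots, and the trigger radius is a guess rather than a tuned value.

```javascript
// Hypothetical list of trigger spots; coordinates are placeholders.
const spots = [
  { name: 'Janss Steps', lat: 34.0722, lon: -118.4427, scan: 'friend1.obj' },
  { name: 'Bruin Bear',  lat: 34.0709, lon: -118.4447, scan: 'stranger1.obj' }
  // ...the remaining spots follow the same pattern
];

// Distance in meters between two lat/lon pairs (haversine formula).
function haversine(lat1, lon1, lat2, lon2) {
  const R = 6371000; // Earth radius in meters
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Returns the spot the user is standing near, or null if none is in range.
function nearestSpot(lat, lon, radiusMeters = 30) {
  return spots.find((s) => haversine(lat, lon, s.lat, s.lon) < radiusMeters) || null;
}
```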

I realize I took on a project that was a little too ambitious, but I now know a ton about p5.js and WebGL. In the future I would rather start small and test things to make sure they work before making larger jumps ahead.

You can see my code here.