syntaxfree Posted January 3, 2013

Use case. Very specific, but it's an instance of a larger problem. I have an IFTTT recipe that sends my Foursquare check-ins to Evernote. This is useful both because I take pictures with Foursquare -- it ends up replacing Evernote Food for all but the fanciest, most expensive meals (which deserve memorializing on their own) -- and because I can search for (counterfactual example) "Burger King" and see how many times I've been to Burger King in a date range.

What search can't do is differentiate between Burger King instances so they can be plotted on the Atlas. Right now the Foursquare note doesn't have any textual information about location, just a map image, but it does have a link to a 4sq.com page with the full information. (Fiddling with IFTTT doesn't seem to fix this.) In all likelihood it's unfeasible to just crawl links for Atlas info, so I don't have a solution. Maybe talk to Foursquare or IFTTT.

This is one specific use case, but there are others. Atlas should pick up on addresses in plain text in general, and on metadata from images. It's a loosey-fuzzy problem in implementation space (so many syntaxes to parse), but Evernote excels at loosey-fuzzy problems elsewhere -- the related notes in Clearly are a spine-chilling sign of the impending Kurzweilian singularity. I work with machine learning for a $nondisclosure_agreement startup; I know the pain.

The more general problem is that the location where a note is generated does not necessarily reflect the location where its content was generated. I hardly generate notes on the go, and when I do, it's because I'm using Evernote as a makeshift word processor in a pinch -- the location where I finish writing isn't relevant, because it's just where the train happened to be when I finished the technical note it took me 40 minutes to write.

Another issue is content generated externally.
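For the image-metadata case, the core of what Atlas would need is already standardized: EXIF stores GPS position as degrees/minutes/seconds rationals plus N/S/E/W hemisphere references, which convert straightforwardly to decimal coordinates. A minimal sketch of that conversion (the function name is illustrative, not any Evernote API):

```python
# Hedged sketch: turning EXIF-style GPS tags (degrees, minutes, seconds
# plus a hemisphere reference) into decimal degrees that a map feature
# like Atlas could plot.

def dms_to_decimal(dms, ref):
    """Convert a (degrees, minutes, seconds) tuple and a hemisphere
    reference ('N', 'S', 'E', 'W') to signed decimal degrees."""
    degrees, minutes, seconds = (float(x) for x in dms)
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

# Example: Rio de Janeiro is roughly 22 deg 54' 41" S, 43 deg 12' 21" W.
lat = dms_to_decimal((22, 54, 41), "S")
lon = dms_to_decimal((43, 12, 21), "W")
print(round(lat, 4), round(lon, 4))  # prints -22.9114 -43.2058
```

Address strings in plain text are the genuinely fuzzy half of the problem; the EXIF half is mostly bookkeeping like the above.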
I'm more likely to use the phone's built-in camera and a voice-activated recorder app and then push those things to Evernote -- the iPhone app doesn't handle corner cases gracefully, and I can't take the 3% chance, particularly with two hours of audio to be reviewed later. (And I'll still prefer voice-activated recording in that particular case.)

(Crash course on crashing softly: even if the VAR app crashes -- which it actually never, ever does -- it would leave behind most of the audio in chunks, since silences cause partial files to be written. Another aggravating problem I had just today: when Evernote is photographing a document but can't reach the image-processing server, it just drains your battery and gives up when you switch apps. In Rio the 3G coverage is a couple of orders of magnitude better than in the average US city, but I was in a rough spot. Document photographing should crash softly: push a normal image note to Evernote, try to contact the image-processing server, and leave the unprocessed note as a to-do item.)

Back to the point: pushing more functionality into Evernote is cool, and improving it for corner cases is even better, but you can't cover all the media generation one does on a phone, nor can you get a cat scalded by boiling milk to try to drink it again. It does not follow that note creation place reflects content creation place.

Atlas looks extremely useful at face value -- I have a very "located" memory and workflow; I remember things by remembering context. Yet all Atlas knows right now is my workplace, my home, and one random location from writing a long text in a moving car -- all while much note-type content is being generated in Foursquare and IFTTT-piped into my green external brain, often with location data embedded in image metadata.

TL;DR: I love the idea of Atlas, but it's nonfunctional right now.
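The "crash softly" flow for document photographing can be sketched as follows; this is a sketch of the pattern I'm asking for, and every function name here is hypothetical, not Evernote's actual API:

```python
# Hedged sketch of "crash softly" document capture: persist the raw
# photo as a plain note FIRST, then attempt server-side processing;
# on a network failure, keep the raw note flagged as a to-do instead
# of losing the capture. save_note and process_on_server are
# hypothetical callables injected by the caller.

def capture_document(image_bytes, save_note, process_on_server):
    """save_note(data, tags=..., replace=...) persists a note and
    returns its id; process_on_server(data) may raise ConnectionError."""
    # Step 1: nothing can be lost from here on.
    note_id = save_note(image_bytes, tags=["unprocessed"])
    try:
        # Step 2: best-effort enhancement over the network.
        processed = process_on_server(image_bytes)
        save_note(processed, tags=["processed"], replace=note_id)
    except ConnectionError:
        # Step 3: soft crash -- the raw note stays behind as a to-do
        # item to be reprocessed when connectivity returns.
        pass
    return note_id
```

The design point is simply ordering: the irreversible save happens before the fallible network call, so a dead 3G link degrades the result instead of discarding it.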
This topic is now archived and is closed to further replies.