In preparation for a webinar that I am hosting on Thursday, I tested a new app that is supposed to help users identify trees. The app is called FindATree. The concept behind the app is solid, but the execution is lacking. The app has you answer a few questions about the characteristics of the tree in front of you, and based on those responses it tells you what you're seeing. Except in my case it didn't identify the correct tree. The app repeatedly told me that I was looking at a red cedar when I knew I was looking at an eastern hemlock.
The FindATree app lacked a few components that would make it better. First, it should ask more detailed questions before stating a result. Second, it needs a bigger database of tree images that users can compare against what they're seeing in real life. And third, an augmented reality component would make it possible to capture a picture of a tree and compare it to that database. While it would take a long time to build an app that includes every possible tree and variation of tree, students could build their own regionally based app with either the MIT App Inventor or Metaverse.
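To show what I mean by more detailed questions, here's a minimal sketch, in Python, of a question-driven tree key. This is not how FindATree actually works; the three species and the traits listed for them are my own simplified assumptions, chosen only to illustrate how a few extra questions can separate an eastern hemlock from an eastern red cedar.

```python
# A hypothetical sketch of a question-driven tree key.
# Not FindATree's actual logic; species and traits are simplified assumptions.

QUESTIONS = [
    ("foliage", "Does the tree have needles (n) or flat, scale-like leaves (s)? "),
    ("bundled", "Are the needles grouped in bundles or clusters? (y/n) "),
    ("stripes", "Are there two whitish stripes on the needle undersides? (y/n) "),
]

# Each species lists the answers that are consistent with it.
SPECIES = {
    "Eastern red cedar":  {"foliage": "s"},
    "Eastern white pine": {"foliage": "n", "bundled": "y"},
    "Eastern hemlock":    {"foliage": "n", "bundled": "n", "stripes": "y"},
}

def identify():
    """Ask questions until only one candidate species remains."""
    answers = {}
    matches = list(SPECIES)
    for key, prompt in QUESTIONS:
        answers[key] = input(prompt).strip().lower()[:1]
        # Keep only species whose known traits agree with every answer so far.
        matches = [
            name for name, traits in SPECIES.items()
            if all(answers[k] == v for k, v in traits.items() if k in answers)
        ]
        if len(matches) <= 1:
            break
    return matches[0] if matches else "No match; more questions (or photos) needed"

if __name__ == "__main__":
    print("Best guess:", identify())
```

A student project built in MIT App Inventor or Metaverse could encode the same handful of questions for the trees commonly found in the students' own region.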
To be clear, I didn’t write this post to bash FindATree. I wrote it to share the idea that I got from testing an app that didn’t work as I thought it would.