There was a gap in my understanding of how publicly linked data is used and maintained in an application, especially when the data needs to be dynamic. I understand that there are public APIs that can be used to perform such tasks, but I did not fully understand how. Another thing that was unclear to me was how to keep data obtained via APIs linked by applying what we have learned so far about linked data. Would this require more manual labor to tag the entire data set? I decided to try looking for examples or tutorials that would help me better understand these concepts. Through my research I have learned that linked data was underused in the past because it relied on manual tagging, and that tagging was inconsistent. This made it time-consuming to use and required screen scraping of individual sites to obtain or manually link the data. There are now services that act as repositories for linked data sets. That brings up more questions: why do we not have one source that all data is linked to? Does this duplication of data cause gaps in the connections between linked data sets? These are questions I intend to ask Dr. Bansal for clarification on.
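To make the "obtaining dynamic data via a public API" idea concrete for myself, here is a minimal sketch of fetching and parsing a JSON response. The endpoint URL and the payload shape are hypothetical, so the parsing step below is demonstrated on a sample string rather than a live request:

```python
import json
from urllib.request import urlopen

def fetch_json(url):
    """Fetch and decode a JSON payload from a public API endpoint."""
    with urlopen(url) as response:
        return json.load(response)

# Parsing works the same whether the payload arrives over the network
# or from a string; a made-up sample response is used here so the
# snippet runs without hitting a real API.
sample_payload = '{"results": [{"id": 1, "label": "linked data"}]}'
data = json.loads(sample_payload)
print(data["results"][0]["label"])  # → linked data
```

The point is that the application re-fetches the payload whenever it needs fresh data, which is what makes API-backed data dynamic rather than a static, manually tagged copy.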
JSON-LD is a method for serializing and transferring linked data. This talk really piqued my interest in why someone would use JSON-LD over other methods of transferring data. I found a site dedicated to explaining what JSON-LD is and how to use it. The site contained many useful tutorials covering everything from linked data basics to the issues we face with linked data and how JSON-LD aims to resolve some of those issues. JSON-LD's main purpose is to resolve the ambiguity among naming schemes across data sources by giving the data a context, mainly when obtaining data via JSON.
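To check my understanding of how a context removes ambiguity, here is a small sketch. The document and its vocabulary terms are hypothetical (I borrowed schema.org-style IRIs for illustration), and the hand-rolled "expansion" is only a rough imitation of what a real JSON-LD processor does:

```python
import json

# A minimal JSON-LD document: the "@context" maps short property
# names to unambiguous IRIs, so "name" here means schema.org's
# notion of a name rather than any other source's "name" field.
doc = {
    "@context": {
        "name": "http://schema.org/name",
        "homepage": "http://schema.org/url",
    },
    "name": "Jane Doe",
    "homepage": "http://example.com/jane",
}

# Rough expansion by hand: replace each term with the IRI from the
# context, which is the disambiguation step a JSON-LD processor
# performs (real processors handle nesting, datatypes, and more).
context = doc["@context"]
expanded = {context[key]: value for key, value in doc.items() if key != "@context"}
print(json.dumps(expanded, indent=2))
```

After expansion, two data sources that both use a key called "name" can still be merged safely, because each key has been resolved to a full IRI that says which vocabulary it came from.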
My goal for next week is to try creating an API-based web application.