COMPARATIVE STUDY OF ANNOTATION TOOLS AND TECHNIQUES
Chapter One
Introduction
1.1 Background of the Study
In recent years, big data has emerged as a prominent research area in the IT sector. Because so many devices are connected to the Internet, massive amounts of data are generated every day.
This data comes from a variety of sources, including the World Wide Web, e-commerce, and social media platforms such as Facebook and Twitter. Big data may be classified into two types: structured data and unstructured data.
Structured data is conventional data that can be stored in a relational database; such data is straightforward to query and analyse (Kiefer, 2016).
Unstructured data is typically derived from email, images, documents, video files, audio, and other sources. It is difficult to process with relational databases, which makes handling such data a challenge.
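As a minimal illustration of this contrast (the table, columns, and example messages below are invented purely for this sketch), structured rows can be queried directly with SQL, whereas even a simple question over unstructured text requires ad hoc parsing or annotation:

import sqlite3

# Structured data: rows follow a fixed schema and can be queried directly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 'Ada', 25.50), (2, 'Femi', 110.00)")
print(conn.execute("SELECT customer FROM orders WHERE total > 100").fetchall())  # [('Femi',)]

# Unstructured data: free text has no schema, so even the simple question
# "which messages mention an order?" needs ad hoc text processing.
emails = [
    "Hi, my order arrived late but intact.",
    "Please send the updated brochure.",
]
print([text for text in emails if "order" in text.lower()])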
Many annotation tools and techniques have been proposed to aid big data analysis; this study investigates those annotation approaches.
Annotation has played an important part in a variety of industries (education, health, commerce, etc.); for example, in machine learning, annotation tools are used to label the data sets on which models are trained.
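For instance, in a supervised machine-learning setting, annotation usually means attaching labels to raw examples; a minimal, hypothetical labelled data set for a sentiment classifier might look like this:

# Each "label" value is the annotation supplied by a human annotator or an
# annotation tool; a model is then trained on these (text, label) pairs.
training_data = [
    {"text": "The delivery was fast and the product works well.", "label": "positive"},
    {"text": "The app keeps crashing after the last update.", "label": "negative"},
]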
Annotation’s definition varies based on the context in which it is used. Okunoye, Oladejo, and Odumuyiwa (2010) define annotation as a method of analysing a document.
Annotation can take several forms: identifying an object, leaving a comment, tagging photographs, audio, and video, and so on. Annotating a document makes it more thorough, more informative, and easier to query, which adds value to the document.
Okunoye et al. (2010) describe two types of annotation: implicit and explicit. Implicit annotations are those intended to be understood only by their creator. Unlike implicit annotation, explicit annotation assumes that the meaning of the annotation is known by a group, team, or users within the same field of study (Okunoye et al., 2010).
Annotation as an object is described as a purposeful and topical value-adding note associated with an existing information object (Bodain & Robert, 2007).
Brusilovsky (2005) describes an annotation as "any object (annotation) that is associated with another object (document) by some relationship", and further defines annotation as both an object and an action that involves anchoring the object to the relevant document.
Annotation is described as the act of interpreting a document (Robert, 2007). It is the act of establishing an annotation as an object and anchoring it to the document object (that is, the information source being annotated).
Annotation is also defined as the process of adding additional information (metadata) to a database record in order to provide a better understanding and connection to related information.
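Taken together, these definitions treat an annotation as an object that is anchored to a document and linked to it by some relationship. A minimal sketch of such a structure (the class and field names are illustrative, not drawn from any particular tool) could look like this:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Annotation:
    anchor: str    # the location or span in the document being annotated
    body: str      # the note, tag, or metadata added by the annotator
    relation: str  # how the annotation relates to the document, e.g. "comment"

@dataclass
class Document:
    text: str
    annotations: List[Annotation] = field(default_factory=list)

    def annotate(self, anchor: str, body: str, relation: str = "comment") -> Annotation:
        # Anchoring: create the annotation object and attach it to this document.
        note = Annotation(anchor, body, relation)
        self.annotations.append(note)
        return note

doc = Document("Big data may be classified into structured and unstructured data.")
doc.annotate("unstructured data", "Requires special tools to query.")
print(len(doc.annotations))  # 1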
Annotation can be done manually, semi-automatically, or automatically; the three modes are contrasted briefly in the sketch that follows the list below.
• Automatic annotation: computerised tools annotate documents without human involvement.
• Semi-automatic annotation: computerised tools annotate the document, but human interaction is still required.
• Manual annotation: the annotation is completed entirely by a human annotator.
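The three modes can be contrasted with a toy sketch (the tagging rules and function names below are hypothetical, chosen only to make the distinction concrete; real tools use far richer models than a regular expression):

import re

# Automatic annotation: a rule tags the document with no human involvement.
def auto_annotate(text: str) -> list:
    tags = []
    if re.search(r"\binvoice\b", text, re.IGNORECASE):
        tags.append("finance")
    if re.search(r"\bmeeting\b", text, re.IGNORECASE):
        tags.append("meeting")
    return tags

# Semi-automatic annotation: the tool suggests tags; a human confirms or rejects each one.
def semi_automatic_annotate(text: str, confirm) -> list:
    return [tag for tag in auto_annotate(text) if confirm(text, tag)]

# Manual annotation: the tags come entirely from a human annotator.
def manual_annotate(text: str, human_tags: list) -> list:
    return list(human_tags)

print(auto_annotate("Invoice 2041: payment received on 12 May."))  # ['finance']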
1.2 Statement of the Problem
Unstructured data can yield useful information when it is properly processed and analysed. Many annotation tools and techniques can be used to structure unstructured data;
however, because so many annotation tools and techniques are available, it may be difficult to select the tool that is appropriate for a given data set or operating system.
There is a need to research and assess various annotation tools and approaches to determine their usefulness, usability, strengths, and the type of data they are best suited for.
This thesis provides a comparative examination of several annotation tools and approaches to aid in deciding which tool to use when confronted with a problem that can be solved through annotation.
1.3 Research Aim and Objectives
1.3.1 Aim
The aim is a comprehensive review of existing annotation proposals (theses, publications, and software), with the end result being a table that compares existing methodologies and tools. The tools identified are then tested.
1.3.2 Objectives
• Investigate and evaluate various existing annotation tools and methodologies.
• Test the tools and techniques identified.
• Evaluate and compare the annotation tools and techniques.
• Provide recommendations for the best tools for particular contexts of use.
1.4 Limitations of the Study
This study is limited to comparing a selection of available annotation tools and methodologies for processing unstructured data.
1.5 Research Outline
The research is organised into five chapters. Each chapter addresses several issues and subtopics, as follows:
Chapter 2 covers the fundamentals of annotation tools and techniques, as well as a survey of the literature.
Chapter 3 investigates, examines, and compares various annotation tools. Chapter 4 contains the comparison results and commentary. Finally, Chapter 5 includes a summary, conclusion, recommendations, and future work.