For example, they argued that the principles of Euclidean geometry were developed using human reason, and were not the result of supernatural revelation or sensory experience.
In our closing remarks we will take a broader view of NLP, including its foundations and the further directions you might want to explore.
Some of the topics are not well-supported by NLTK, and you might like to rectify that problem by contributing new software and data to the toolkit.
Each applies distinct methodologies to gather observations, develop theories, and test hypotheses.
All serve to deepen our understanding of language and of the intellect that is manifested in language.
You should now be equipped to work with large datasets, to create robust models of linguistic phenomena, and to extend them into components for practical language technologies.
We hope that the Natural Language Toolkit (NLTK) has served to open up the exciting endeavor of practical natural language processing to a broader audience than before.
In NLP this issue surfaces in debates about the priority of corpus data versus linguistic introspection in the construction of computational models.
A further concern, enshrined in the debate between realism and idealism, was the metaphysical status of the constructs of a theory.
This principle provided a useful correspondence between syntax and semantics, namely that the meaning of a complex expression could be computed recursively from the meanings of its parts.

The approaches just outlined share the premise that computing with natural language crucially relies on rules for manipulating symbolic representations.
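The recursive computation of meaning mentioned above can be sketched in a toy setting. The example below is purely illustrative and not part of NLTK: it uses arithmetic as a stand-in semantic domain, where expressions are nested tuples and the meaning of a complex expression is computed from the meanings of its parts.

```python
# Toy illustration of compositionality: the meaning of a complex
# expression is a function of the meanings of its parts.
# The "language" here is arithmetic; meanings are numbers.
# (All names and the arithmetic domain are illustrative assumptions.)

OPS = {"plus": lambda a, b: a + b, "times": lambda a, b: a * b}

def meaning(expr):
    """Recursively interpret a nested (op, left, right) tuple."""
    if isinstance(expr, (int, float)):
        return expr  # atomic expression: its meaning is the value itself
    op, left, right = expr
    # The meaning of the whole is computed from the meanings of the parts.
    return OPS[op](meaning(left), meaning(right))

# "(two plus three) times four"
print(meaning(("times", ("plus", 2, 3), 4)))  # → 20
```

However simple, the sketch shows the key property: interpretation proceeds by recursion on syntactic structure, so the semantics of arbitrarily complex expressions follows from a finite set of rules.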