Psychometric theory provides a natural foundation for analyzing text. The rapid increase in the availability of text as data has contributed to its growing use in measuring psychological and educational constructs. With that in mind, the validity of scores derived from text needs to be evaluated in order to make sound scientific claims. This dissertation focuses on how to evaluate the psychometric properties of scores derived from text.

A common approach to analyzing text is topic modeling. This dissertation focuses on a specific topic model, Latent Dirichlet Allocation (LDA). LDA is a topic model that summarizes a group of documents with a set of topics. A major advantage of LDA is that it can quickly score documents using a data-driven approach, as sketched at the end of this overview. This dissertation also reframes LDA as a measurement model, under which the scores derived from text can be rigorously evaluated.

Chapter 1 introduces relevant text mining techniques with an emphasis on LDA, including an applied example in which text responses are gathered from students taking a statistics course. Chapter 2 investigates the use of informative priors with LDA. Chapter 3 proposes new methods to evaluate the reliability of scores from LDA. Chapter 4 proposes a new model that jointly estimates models for both self-report and text data. Finally, Chapter 5 closes with concluding thoughts and potential future directions.
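
To illustrate the kind of data-driven scoring referred to above, the following is a minimal sketch, assuming Python with scikit-learn's LatentDirichletAllocation; the toy documents, the two-topic setting, and the variable names are illustrative assumptions and are not taken from the dissertation.

    # Minimal illustrative sketch: scoring documents with LDA via scikit-learn.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy text responses (hypothetical), standing in for student answers.
    docs = [
        "the sampling distribution of the mean",
        "confidence intervals and hypothesis tests",
        "regression slopes and residual plots",
    ]

    # Convert raw text to a document-term count matrix.
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(docs)

    # Fit LDA and obtain each document's topic proportions (the "scores").
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topic_scores = lda.fit_transform(X)  # each row sums to 1 across topics

    print(doc_topic_scores)

In this sketch, the rows of doc_topic_scores are the document-level topic proportions; treating such proportions as scores is what motivates evaluating their reliability and validity in the chapters that follow.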