Presentation on the theme: "Semantic Drift between the Testaments: Using Collocation Analysis to Find Theological Significance. Matt Munson." — Presentation transcript:
Semantic Drift between the Testaments Using Collocation Analysis to Find Theological Significance Matt Munson
Theological Background
Use of the Old Testament in the New
– Similarities
– Differences
– Relative meaning
But re-use goes beyond quotations
– What about the similarities, differences, and relative meanings of individual words?
– Can we detect theological significance even here?
Linguistic Background – Collocations
Firth: "You shall know a word by the company it keeps!"
Harris: "If we consider words or morphemes A and B to be more different in meaning than A and C, then we will often find that the distributions of A and B are more different than the distributions of A and C. In other words, difference of meaning correlates with difference of distribution."
My Hypothesis
Linguistic:
– By comparing the collocation word fields of the same target word in the Septuagint and the New Testament, one can detect which words have changed most in meaning from one Testament to the other.
Theological:
– Further investigation of how the collocation fields have changed will lead to insights concerning the theological changes from the LXX to the NT.
My Method
– Lemmatized Greek texts
– Collocation span of 4 left (4L) and 4 right (4R)
– Co-occurrence counts
– Log-likelihood significance
– Cosine similarity of log-likelihood tables
– Comparison of the log-likelihood and cosine-similarity tables for each lemma
Lemmatized Greek Texts
Greek is a highly inflected language
– Nouns: 8 distinct forms
– Verbs: would you believe over 200 forms?
Without lemmatizing, each of these forms appears to the computer to be a unique word
That could be interesting, but there is not enough data to overcome the resulting atomization
Collocation Span of 4L and 4R
Experiments have shown this to be the most effective span

L4 | L3      | L2    | L1     | Lemma | R1      | R2      | R3  | R4
ἐν | ἀρχή    | ποιέω | ὁ      | θεός  | ὁ       | οὐρανός | καί | ὁ
ὁ  | ἄβυσσος | καί   | πνεῦμα | θεός  | ἐπιφέρω | ἐπάνω   | ὁ   | ὕδωρ
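The span extraction above can be sketched as follows. This is a minimal illustration, not the talk's actual code: the function name `window` is my own, and the token list is the lemmatized opening of Gen 1:1 (LXX) as shown on the slide.

```python
# Minimal sketch: extract a 4-left/4-right collocation window around each
# occurrence of a target lemma in a lemmatized token list.
def window(tokens, target, left=4, right=4):
    spans = []
    for i, tok in enumerate(tokens):
        if tok == target:
            # words before the target (up to 4) plus words after it (up to 4)
            spans.append(tokens[max(0, i - left):i] + tokens[i + 1:i + 1 + right])
    return spans

# Lemmata of Gen 1:1 (LXX), as on the slide, with ἡ γῆ appended for context
gen_1_1 = ["ἐν", "ἀρχή", "ποιέω", "ὁ", "θεός", "ὁ", "οὐρανός", "καί", "ὁ", "γῆ"]
print(window(gen_1_1, "θεός"))
# [['ἐν', 'ἀρχή', 'ποιέω', 'ὁ', 'ὁ', 'οὐρανός', 'καί', 'ὁ']]
```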
Co-occurrence Counts
Simple counts of how often a collocate occurs in the given span of the target
Example:
– 'ἔντιμος' 1, 'ἀπόδεκτος' 1, 'προευαγγελίζομαι' 1, 'γεννάω' 11, 'κιθάρα' 1, 'ὀλίγος' 1, 'πρό' 4, 'ἀνοίγω' 2, 'ἐπιποθέω' 2, 'ἀστεῖος' 1, 'ἔμπροσθεν' 6, 'μετάνοια' 7, 'ἐκπορεύομαι' 2, 'ὅτε' 9, 'οἰκτιρμός' 2, 'Ῥαιφάν' 1, 'ὅτι' 122…
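Such a count table can be produced by tallying every word that falls inside the 4L/4R span of each occurrence of the target. A minimal sketch (the function name is hypothetical; the short token list is illustrative, from the Gen 1:1 lemmata on the earlier slide):

```python
from collections import Counter

def cooccurrence_counts(tokens, target, left=4, right=4):
    """Count how often each lemma appears in the 4L/4R span of the target."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            counts.update(tokens[max(0, i - left):i])   # left context
            counts.update(tokens[i + 1:i + 1 + right])  # right context
    return counts

tokens = ["ἐν", "ἀρχή", "ποιέω", "ὁ", "θεός", "ὁ", "οὐρανός", "καί", "ὁ", "γῆ"]
print(cooccurrence_counts(tokens, "θεός"))
# the article ὁ occurs 3 times in the span, each other collocate once
```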
Log-Likelihood Significance
"Significant collocation is regular collocation between two items, such that they co-occur more often than their respective frequencies." (Léon, 14)
"Log-likelihood measures the strength of association between words by comparing the occurrences of words respectively and their occurrences together."
– Also appropriate for sparse data
This measures syntagmatic relationships
More information: TU Darmstadt LinguisticsWeb
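The talk does not give its exact implementation, but the standard log-likelihood collocation measure is Dunning's G² statistic over a 2×2 contingency table of target/collocate co-occurrence. A sketch under that assumption:

```python
import math

def log_likelihood(k11, k12, k21, k22):
    """Dunning's G² for a 2x2 contingency table (an assumed layout):
    k11: collocate inside the target's span; k12: collocate elsewhere;
    k21: other words inside the span;       k22: other words elsewhere."""
    n = k11 + k12 + k21 + k22
    # expected counts under the hypothesis that target and collocate
    # are independent: (row total * column total) / grand total
    cells = [
        (k11, (k11 + k12) * (k11 + k21) / n),
        (k12, (k11 + k12) * (k12 + k22) / n),
        (k21, (k21 + k22) * (k11 + k21) / n),
        (k22, (k21 + k22) * (k12 + k22) / n),
    ]
    return 2.0 * sum(k * math.log(k / e) for k, e in cells if k > 0)

# An evenly spread table shows no association:
print(log_likelihood(10, 10, 10, 10))  # → 0.0
```

Because G² compares observed against expected counts rather than relying on asymptotic normality, it behaves well on the sparse counts typical of a corpus the size of the LXX or NT, which matches the slide's remark about sparse data.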
Cosine Similarity of Log-Likelihood Tables
– Cosine similarity is often used to measure the similarity between word frequency lists
– I used it to compare log-likelihood tables, which have the same form as frequency lists
– I compared all the tables in the LXX to each other, and all those in the NT to each other
– I also compared the same lemmata in each Testament to each other
– This measures paradigmatic relationships
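Treating each log-likelihood table as a vector indexed by collocate, cosine similarity is the dot product of two tables divided by the product of their lengths. A minimal sketch; the example tables and their scores are invented purely for illustration, not taken from the corpora:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two score tables (dicts: collocate -> LL score)."""
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical LL tables for the same lemma in each Testament
lxx = {"οὐρανός": 9.3, "ποιέω": 4.1, "λαός": 2.2}
nt = {"οὐρανός": 7.8, "πατήρ": 5.0, "ποιέω": 1.5}
print(cosine_similarity(lxx, nt))  # between 0 (disjoint) and 1 (identical)
```

Two lemmata whose collocate fields overlap heavily score near 1 even if they never co-occur with each other, which is why this comparison captures paradigmatic rather than syntagmatic relationships.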
Compare Log-Likelihood Tables
– This shows which collocates occur more significantly with the lemma in the LXX and which in the NT
– Positive means more significant in the LXX, negative in the NT
– Syntagmatic comparison
– Shows change in usage, but not change in meaning directly
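The sign convention above amounts to subtracting the NT score from the LXX score for every collocate. A sketch with invented scores (the function name and example values are illustrative only):

```python
def compare_tables(lxx_ll, nt_ll):
    """Per-collocate difference of log-likelihood scores.
    Positive: the collocate is more significant with the lemma in the LXX;
    negative: more significant in the NT."""
    collocates = set(lxx_ll) | set(nt_ll)
    return {w: lxx_ll.get(w, 0.0) - nt_ll.get(w, 0.0) for w in collocates}

# Hypothetical LL tables for one lemma in each Testament
diff = compare_tables({"οὐρανός": 9.3, "λαός": 2.2}, {"οὐρανός": 7.8, "πατήρ": 5.0})
# λαός leans toward the LXX (positive), πατήρ toward the NT (negative)
```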
Compare Cosine Similarity Tables
– This shows to which other lemmata each lemma in each Testament is attracted
– The value is positive if they are more attracted to each other in the LXX, negative if in the NT
– Paradigmatic comparison
– These comparisons should suggest meaning change
Compare Cosine Similarity Tables – Results for the lemma θεός
(positive values: more attracted in the LXX; negative: in the NT)

Lemma        | English                      | Value
συνίημι      | to come together, understand | 0.87504707
δοξάζω       | to magnify, extol            | 0.8714031
ταπεινόω     | to lower, to abase           | 0.83253347
φέρω         | to carry                     | 0.81786621
σπέρμα       | seed                         | 0.81385463
ἀρχή         | beginning, power             | 0.80058442
ἐπερωτάω     | to consult                   | 0.80056964
Δαυίδ        | David                        | 0.78433087
πρεσβύτερος  | elder                        | 0.7824249
γλῶσσα       | tongue                       | 0.77929021
…
καιρός       | time, season                 | -0.55061329
ἕτερος       | other                        | -0.56250249
οὖν          | and so                       | -0.58666004
ἐμός         | mine (possessive)            | -0.58994683
ἵνα          | in order to                  | -0.62106431
βάλλω        | to throw                     | -0.63303536
ὥρα          | part of a day, hour          | -0.64563456
μᾶλλον       | more                         | -0.68514535
χάρις        | gift, grace                  | -0.76401283
περιπατέω    | to walk (about), to live     | -0.97335415
Next Steps
– Finish the comparison of the LL and CS tables
– Include other information in the analysis
  – POS information
  – Semantic dependencies (could help to account for Greek sentence structure)
– Remove information from the analysis
  – Stop words
  – Certain parts of speech (e.g., adverbs, particles)
– Close-reading analysis of the results