Bibliography for Class 9 of English 238 (Fall 2021)
Digital Humanities: Introduction to the Field
The following is the part of the English 238 bibliography relevant to this class. (Also see: cumulative course bibliography.)
Ajayi, Demi. “How BERT and GPT Models Change the Game for NLP.” Watson Blog (blog), 2020. https://www.ibm.com/blogs/watson/2020/12/how-bert-and-gpt-models-change-the-game-for-nlp/.
Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, 610–23. FAccT ’21. Virtual Event, Canada: Association for Computing Machinery, 2021. https://doi.org/10.1145/3442188.3445922.
Heuser, Ryan. “Word Vectors in the Eighteenth Century.” In DH 2017. Montreal: McGill University, Université de Montréal, Alliance of Digital Humanities Organizations (ADHO), 2017. https://dh2017.adho.org/abstracts/582/582.pdf.
Hua, Minh, and Rita Raley. “Playing With Unicorns: AI Dungeon and Citizen NLP.” Digital Humanities Quarterly 14, no. 4 (2020).
King, Adam Daniel. “InferKit Demo,” 2020. https://app.inferkit.com/demo.
Moore, Samuel K., David Schneider, and Eliza Strickland. “How Deep Learning Works.” IEEE Spectrum, 2021. https://spectrum.ieee.org/what-is-deep-learning/neural-network.
Nicholson, Chris. “A Beginner’s Guide to Generative Adversarial Networks (GANs).” Pathmind, 2020. http://wiki.pathmind.com/generative-adversarial-network-gan.
Nicholson, Chris. “A Beginner’s Guide to Neural Networks and Deep Learning.” Pathmind, 2020. http://wiki.pathmind.com/neural-network.
Offert, Fabian. “Intuition and Epistemology of High-Dimensional Vector Space.” Fabian Offert, 2021. https://zentralwerkstatt.org/blog/vsm.
Offert, Fabian, and Peter Bell. “Generative Digital Humanities.” In CEUR Workshop Proceedings, 202–12. Amsterdam: CEUR-WS.org, 2020. http://ceur-ws.org/Vol-2723/short23.pdf.
Schmidt, Benjamin. “Vector Space Models for the Digital Humanities.” Bookworm (blog), 2015. http://bookworm.benschmidt.org/posts/2015-10-25-Word-Embeddings.html.
Schmitt, Philipp. I Am Sitting In A High-Dimensional Room. 2020. 2-channel audio simulation, 22:16 min. https://philippschmitt.com/work/i-am-sitting-in-a-high-dimensional-room.
Schmitt, Philipp. “On Being a Vector inside a Neural Network,” 2019. https://philippschmitt.com/writing/on-being-a-vector-inside-a-neural-network.
Thompson, Nicholas. “An AI Pioneer Explains the Evolution of Neural Networks.” Wired, 2019. https://www.wired.com/story/ai-pioneer-explains-evolution-neural-networks/.
Underwood, Ted. “Science Fiction Hasn’t Prepared Us to Imagine Machine Learning.” The Stone and the Shell (blog), 2021. https://tedunderwood.com/2021/02/02/why-sf-hasnt-prepared-us-to-imagine-machine-learning/.