{"id":1072,"date":"2022-12-07T11:36:52","date_gmt":"2022-12-07T11:36:52","guid":{"rendered":"https:\/\/www.denizcemonduygu.com\/philo\/?p=1072"},"modified":"2022-12-07T11:36:52","modified_gmt":"2022-12-07T11:36:52","slug":"getting-help-from-ai-to-tag-sentences","status":"publish","type":"post","link":"https:\/\/www.denizcemonduygu.com\/philo\/getting-help-from-ai-to-tag-sentences\/","title":{"rendered":"Getting Help From AI to Tag Sentences"},"content":{"rendered":"<p><b>I\u2019ve recently <\/b><a href=\"https:\/\/www.denizcemonduygu.com\/philo\/interface-content-upgrade-tags-improved-search-dark-mode-and-more\/\" target=\"_blank\" rel=\"noopener noreferrer\"><b>announced<\/b><\/a><b> a new feature I call \u201ctags\u201d <\/b>\u2013 keywords, shown on the left of the sentences, including names of specific arguments, theories, -isms, etc. which don\u2019t appear in the sentence itself. Tagging, I had said, was an ongoing manual process I\u2019d been carrying out for some time. <b>I had tagged 525 sentences myself <\/b>by going through the total of 1603 using the wet, mushy neural networks in my brain.<\/p>\n<p>Even though I have intentionally avoided any kind of automation in my editorial work on this project since the beginning, I got curious about the kind of help I could get from AI applications for the relatively easy task of tagging sentences. <b>So I plugged <\/b><a href=\"https:\/\/openai.com\/api\/\" target=\"_blank\" rel=\"noopener noreferrer\"><b>OpenAI<\/b><\/a><b>\u2019s <\/b><a href=\"https:\/\/en.wikipedia.org\/wiki\/GPT-3\" target=\"_blank\" rel=\"noopener noreferrer\"><b>Generative Pre-trained Transformer 3<\/b><\/a><b> (GPT-3) into my database.<\/b><\/p>\n<p>GPT-3 is \u201can <a href=\"https:\/\/arxiv.org\/abs\/2005.14165\" target=\"_blank\" rel=\"noopener noreferrer\">autoregressive language model<\/a> with 175 billion parameters, 10\u00d7 more than any previous non-sparse language model\u201d. 
It is <a href=\"https:\/\/dailynous.com\/2020\/07\/30\/philosophers-gpt-3\/#chalmers\" target=\"_blank\" rel=\"noopener noreferrer\">described<\/a> as \u201cone of the most interesting and important AI systems ever produced\u201d by David Chalmers, currently the youngest (56) living philosopher listed in my project. <b>GPT-3 has an <\/b><a href=\"https:\/\/openai.com\/blog\/openai-api\/\" target=\"_blank\" rel=\"noopener noreferrer\"><b>API<\/b><\/a><b> which returns a completion for any text prompt you give, attempting to match the pattern. The prompt I designed for the task went like this:<\/b><\/p>\n<h5 style=\"padding-left: 40px; margin-bottom: 5px;\">Sentence: There are two kinds of substance: mind (which I am as a conscious being; \u2018res cogitans\u2019) and matter (\u2018res extensa\u2019).<\/h5>\n<h5 style=\"padding-left: 40px;\">Category: Dualism<\/h5>\n<h5 style=\"padding-left: 40px; margin-bottom: 5px;\">Sentence: There are no innate certainties to be discovered in our minds, all knowledge comes from experience.<\/h5>\n<h5 style=\"padding-left: 40px;\">Category: Empiricism<\/h5>\n<h5 style=\"padding-left: 40px; margin-bottom: 5px;\">Sentence: The universe consists of matter in motion, and nothing else exists; it is a vast machine.<\/h5>\n<h5 style=\"padding-left: 40px;\">Category: Materialism<\/h5>\n<h5 style=\"padding-left: 40px; margin-bottom: 5px;\">Sentence: \u2026<\/h5>\n<h5 style=\"padding-left: 40px;\">Category<\/h5>\n<p>Thanks to this prompt and a few parameter settings, GPT-3 was able to \u201cunderstand\u201d what I was asking from it and to work just as I intended it to: <b>it read the prompt and produced completions to it 1603 times, each time with a different sentence<\/b> from my collection substituting the \u201c&#8230;\u201d in the prompt (thanks to a function I created in Google Sheets Apps Script based on <a href=\"https:\/\/www.seotraininglondon.org\/gpt3-google-sheets-free-tutorial\/\" target=\"_blank\" rel=\"noopener 
noreferrer\">Richman\u2019s<\/a>), thus offering 1603 suggestions for categories. <b>But how successful was it, from a human point of view?<\/b><\/p>\n<p><code class=\"rl-shortcode\"><div class=\"rl-gallery-container rl-loading\" id=\"rl-gallery-container-1\" data-gallery_id=\"1093\"> <ul class=\"rl-gallery rl-basicslider-gallery \" id=\"rl-gallery-1\" data-gallery_no=\"1\"> <li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags1-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags2-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags3-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags4-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" 
data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags5-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags7-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags8-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags10-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags11-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" 
decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags12-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags13-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags14-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags15-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags16-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" 
src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags17-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags18-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li><li class=\"rl-gallery-item\"><a href=\"javascript:void(0)\" style=\"cursor: default;\" title=\"\" data-rl_title=\"\" class=\"rl-gallery-link\" data-rl_caption=\"\" data-rel=\"norl\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags19-1170x771.png\" width=\"1170\" height=\"771\" \/><\/a><\/li> <\/ul> <\/div><\/code><\/p>\n<p>&nbsp;<\/p>\n<p>Overall, I was impressed. <b>Out of the 525 sentences that I myself had already tagged, GPT-3 suggestions matched 178 (33.9%).<\/b> (This means that my old tags, which might be more than one for a sentence, included the ones by GPT-3 for 178 sentences. Note that it didn&#8217;t know what my old tags were except for the few in the prompt.) <strong>I put GPT-3\u2019s suggestions in a column next to my tags and sentences, went through the 1603 lines one by one<\/strong> (two times) and made additions to my tags where the suggestions made sense to me. 
In the end,<b> the total count of tagged sentences went from 525 up to 705.<\/b> I made changes to the tags of 242 sentences, <i>mostly<\/i> thanks to GPT-3 \u2013 I want to explain \u201cmostly\u201d here.<\/p>\n<p>I noticed <b>three different ways in which GPT-3 influenced me to add tags:<\/b><\/p>\n<ol>\n<li aria-level=\"1\">GPT-3 suggesting terms I was not familiar with \/ would not have thought of by myself (rare)<\/li>\n<li aria-level=\"1\">GPT-3 suggesting terms I was familiar with but hadn\u2019t thought of as tags previously<\/li>\n<li aria-level=\"1\">GPT-3 suggesting terms I had thought of as tags before but wasn\u2019t sure enough to add, thus encouraging me to add them<\/li>\n<\/ol>\n<p>The fact that 242 sentences had additions to their tags with this procedure doesn\u2019t mean all of those additions were directly suggested by GPT-3: going through the 1603 sentences again and again to compare them with the suggested tags, <b>I had moments of coming up with new tags myself<\/b>, independently of GPT-3\u2019s suggestions. There were also cases where I took a suggestion GPT-3 made for one sentence and used it for other sentences (with different GPT-3 tags), sometimes excluding the original sentence it suggested the tag for. 
<b>The number of cases where I implemented (with\/without adjustments) a GPT-3 suggestion specifically made for that sentence is 157<\/b>; if we include its indirect influences, where I took its suggestions and applied them to other sentences, we can say that <b>the total number of sentences to which I added tags <\/b><b><i>thanks to GPT-3<\/i><\/b><b> is 186, which amounts to 11.6% of the sentences.<\/b><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-1130\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01.png\" alt=\"\" width=\"1490\" height=\"593\" srcset=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01.png 1490w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01-300x119.png 300w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01-768x306.png 768w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01-1024x408.png 1024w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01-1170x466.png 1170w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01-870x346.png 870w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-01-490x195.png 490w\" sizes=\"(max-width: 1490px) 100vw, 1490px\" \/><br \/>\nSo the help I got from GPT-3 was not exactly <i>new information <\/i>to me but mostly nudges here and there, making me remember, appreciate, or become convinced about certain terms as tags, for 11.6% of the sentences. <b>You may be unimpressed when it\u2019s phrased like this, but this was just the kind and amount of contribution I was hoping for<\/b>, since I had already done the bulk of the work myself. 
And I\u2019m grateful to be able to get such help with editorial work in <i>philosophy <\/i>with this <i>small amount of money <\/i>($5)<i> and effort <\/i>(designing a prompt and creating a function in Sheets, without the need to train a custom model) this <i>quickly <\/i>(less than a minute to run for the 1603 sentences).<\/p>\n<p>To be sure, <b>the numbers of additions above depend on the criteria of my project, and they greatly underestimate GPT-3\u2019s success <\/b>because there are many cases where:<\/p>\n<ul>\n<li aria-level=\"1\">GPT-3 suggests accurate information that just doesn\u2019t fit my definition of tags for this project: it correctly identifies names of philosophers or -isms generated from them (e.g. <i>Kant<\/i> or <i>Kantianism <\/i>for <a href=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags6s.png\" target=\"_blank\" rel=\"noopener noreferrer\" data-rel=\"lightbox-image-0\" data-rl_title=\"\" data-rl_caption=\"\" title=\"\">Kant\u2019s own sentences<\/a>), or suggests accurate terms that I find not precise (i.e. interesting) enough for my tags (e.g. <i>theism<\/i>).<\/li>\n<li aria-level=\"1\">GPT-3 suggests one correct tag for the majority of a philosopher\u2019s sentences (e.g.\u00a0<i>existentialism<\/i> for <a href=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags9s.png\" data-rel=\"lightbox-image-1\" data-rl_title=\"\" data-rl_caption=\"\" title=\"\">Heidegger&#8217;s<\/a>) but I choose to act stingy in those cases by tagging just a few sentences that best represent the -ism in question.<\/li>\n<li aria-level=\"1\">GPT-3 suggests an okay tag (e.g. 
<i>existentialism <\/i>for sentences by Camus) but I have a more accurate\/precise one (<i>absurdism<\/i>) so I don\u2019t change it.<\/li>\n<li aria-level=\"1\">GPT-3 makes a good suggestion capturing the idea in the sentence, but it would be historically inappropriate to use that tag for that philosopher. (Remember, it doesn\u2019t know who the sentences belong to.)<\/li>\n<li aria-level=\"1\">GPT-3 gets a keyword right, but the sentence actually disputes that argument\/theory\/-ism, so I don\u2019t want to use it as a tag misrepresenting the position of that sentence.<\/li>\n<\/ul>\n<p>If we count positive cases such as these \u2013 <b>if we count all GPT-3\u2019s <\/b><b><i>good guesses<\/i><\/b><b>, ignoring their precision, historical awareness, and usefulness with regard to my specific editorial standards and choices<\/b> \u2013 <b>the success rate of GPT-3 becomes 71.4%! (1144 sentences out of 1603) <\/b>(Yes, I did another round just to determine this number. I was that curious.)<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1091 size-full\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph.png\" alt=\"\" width=\"1500\" height=\"1484\" srcset=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph.png 1500w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-300x297.png 300w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-768x760.png 768w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-1024x1013.png 1024w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-100x100.png 100w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-1170x1158.png 1170w, 
https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-870x861.png 870w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-graph-490x485.png 490w\" sizes=\"(max-width: 1500px) 100vw, 1500px\" \/><\/p>\n<p>I also got <b>another type of corroboration from GPT-3<\/b> for my previous tagging efforts: <b>for the majority of the sentences for which I had little hope of coming up with useful tags, GPT-3 also fails to offer any.<\/b> It either suggests generic terms (e.g. <i>aesthetics <\/i>for sentences about various positions in aesthetics, which is actually a whole branch in the Filters in the Menu) or terms that I could not relate to the sentence (<i>idealism<\/i> is one of its favorite wild cards). This confirms what I had written in the previous post: <b>not every sentence is suitable for tagging.<\/b><\/p>\n<p>We can highlight three numbers to sum up the situation from my and GPT-3\u2019s perspectives:<\/p>\n<ul>\n<li aria-level=\"1\"><b>The actual help I got from GPT-3 (direct + indirect additions): 11.6% <\/b>(186 sentences)<\/li>\n<li aria-level=\"1\"><b>The strict success rate of GPT-3 by my standards (direct additions + matches): 20.9% <\/b>(335 sentences)<\/li>\n<li aria-level=\"1\"><strong>The relaxed success rate of GPT-3 (good guesses): 71.4%<\/strong> (1144 sentences)<\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-1129\" src=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02.png\" alt=\"\" width=\"1490\" height=\"706\" srcset=\"https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02.png 1490w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02-300x142.png 300w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02-768x364.png 768w, 
https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02-1024x485.png 1024w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02-1170x554.png 1170w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02-870x412.png 870w, https:\/\/www.denizcemonduygu.com\/philo\/wp-content\/uploads\/2022\/12\/HoPV-GPT-tags-bar-chart-02-490x232.png 490w\" sizes=\"(max-width: 1490px) 100vw, 1490px\" \/><\/p>\n<p>Please keep in mind, while looking at all these numbers, that there might well be instances where I&#8217;ve misconstrued a good suggestion by GPT-3 and haven\u2019t used\/counted it \u2013 I\u2019m human! \u2013 but I believe these figures are good rough measures of its competence for a task like this. It is also worth noting that the final number of tagged sentences (705) could be given some weight here when judging GPT-3\u2019s (and my) success at tagging: since not every sentence is (objectively or project-specifically) suitable for tagging, <b>maybe the rates should be calculated against 705 instead of 1603 \u2013 making the strict success rate of GPT-3 <\/b><b><i>by my standards<\/i><\/b><b> (335 sentences) jump from 20.9% to 47.5%.<\/b> Of course I don\u2019t believe that 705 is really the final number and that no other sentence can be tagged in a meaningful way (I\u2019ve already made a few changes\/additions since writing this post), but this calculation may contribute to a more balanced picture at this stage.<\/p>\n<p>Witnessing its usefulness first-hand, I wanted to write this detailed post as (1) documentation of GPT-3\u2019s modest contribution to my project and (2) an evaluation of GPT-3\u2019s impressive capabilities for tasks like these. 
I agree now more than before that <b>people are not overstating it when they call it a revolution<\/b> and I\u2019m sure many of you will soon be benefiting from <a href=\"https:\/\/gpt3demo.com\/\" target=\"_blank\" rel=\"noopener noreferrer\">systems like these<\/a>, especially once they are readily <a href=\"https:\/\/twitter.com\/shubroski\/status\/1587136794797244417\" target=\"_blank\" rel=\"noopener noreferrer\">integrated<\/a> into the basic software we use daily.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I\u2019ve recently announced a new feature I call \u201ctags\u201d \u2013 keywords, shown on the left of the sentences, including names of specific arguments, theories, -isms, etc. which don\u2019t appear in the sentence itself. Tagging, I had said, was an ongoing manual process I\u2019d been carrying out for some time. I had tagged 525 sentences myself by going through the total of 1603 using the wet, mushy neural networks in my [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"footnotes":""},"categories":[3],"tags":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p5pEAz-hi","_links":{"self":[{"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/posts\/1072"}],"collection":[{"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/comments?post=1072"}],"version-history":[{"count":28,"href":"https:\/\/
www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/posts\/1072\/revisions"}],"predecessor-version":[{"id":1138,"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/posts\/1072\/revisions\/1138"}],"wp:attachment":[{"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/media?parent=1072"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/categories?post=1072"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.denizcemonduygu.com\/philo\/wp-json\/wp\/v2\/tags?post=1072"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}