{"id":60964,"date":"2024-03-28T16:36:16","date_gmt":"2024-03-28T11:06:16","guid":{"rendered":"https:\/\/www.tothenew.com\/blog\/?p=60964"},"modified":"2024-06-10T15:32:12","modified_gmt":"2024-06-10T10:02:12","slug":"leveraging-genai-for-enhanced-content-creation-in-aem","status":"publish","type":"post","link":"https:\/\/www.tothenew.com\/blog\/leveraging-genai-for-enhanced-content-creation-in-aem\/","title":{"rendered":"Leveraging GenAI for Enhanced Content Creation in AEM"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">There is no doubt that our generation has to go hand in hand with generative AI capabilities.\u00a0 It brings in efficiency in day to day work by super accelerating the launch of new products and features in any kind of an organization.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Today here in this blog, we delve into the ways we can use external AI models within AEM to bring efficiency into the day to day work of an AEM content-author or AEM developer.<\/span><\/p>\n<h3><b>DALL-E 2\/3 from openAI<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">DALL-E is a transformer language model just like GPT3.\u00a0 We all are now familiar with chatGPT and its capabilities.\u00a0 Now how do we bring these capabilities into AEM ? 
<\/span><\/p>\n<h3><b>Adobe Sensei<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">Obviously, Adobe has its own sensei which brings in AI capabilities within Adobe\u2019s creative cloud applications, Adobe Express and Adobe firefly.\u00a0 All these are licensed products and the extreme AI capabilities of firefly can\u2019t be utilized within AEM Assets out of the box as of yet and we don\u2019t have a timeline when it will be available.\u00a0 By extreme capabilities I mean to say the text to image generation, text to image modification features.\u00a0 So what does a customer do with only AEM Sites or AEM Assets license ?<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Here is our solution to it.<\/span><\/p>\n<h2><b>AEM custom component<\/b><i><span style=\"font-weight: 400;\">\u00a0<\/span><\/i><\/h2>\n<p><span style=\"font-weight: 400;\">Here is a custom component in AEM to give the customer a GPT like interface and environment\u00a0 to get indulged in an AI enabled content creation and modification process right within AEM.<\/span><\/p>\n<p><i><span style=\"font-weight: 400;\">Image Generation<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400;\">Our custom component interacts with the DALL-E model to generate images from text prompts and then allows the user to save the image within AEM DAM.\u00a0\u00a0<\/span><\/p>\n<p><i><span style=\"font-weight: 400;\">Image Modification<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400;\">The second stage of it is to have the capability to fetch an image from AEM DAM directly and customize it with text prompts.\u00a0 The modified image will get saved in DAM.<\/span><\/p>\n<p><i><span style=\"font-weight: 400;\">Flexibility of choosing backend Models<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400;\">In the above ways as we mentioned, there is also going to be flexibility with the end user to choose what models would the user like to use.\u00a0 With the availability of public LLMs ( Large Language Models ) we 
can leverage ever more GenAI capabilities in whatever way suits our needs.<\/span><\/p>\n<h2><b>Background to the above processes<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">Several of the organizations hosting these transformer-based models offer subscription plans that expose their GenAI capabilities through authenticated API requests. This is a large part of why they grow more popular by the day, and we can leverage the same opportunity to ease our daily work in AEM.<\/span><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-medium wp-image-60962\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2024\/03\/unnamed-1-300x101.png\" alt=\"\" width=\"300\" height=\"101\" srcset=\"\/blog\/wp-ttn-blog\/uploads\/2024\/03\/unnamed-1-300x101.png 300w, \/blog\/wp-ttn-blog\/uploads\/2024\/03\/unnamed-1.png 337w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-medium wp-image-60963\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2024\/03\/Screenshot-from-2024-03-27-12-26-23-300x251.png\" alt=\"\" width=\"300\" height=\"251\" srcset=\"\/blog\/wp-ttn-blog\/uploads\/2024\/03\/Screenshot-from-2024-03-27-12-26-23-300x251.png 300w, \/blog\/wp-ttn-blog\/uploads\/2024\/03\/Screenshot-from-2024-03-27-12-26-23-768x642.png 768w, \/blog\/wp-ttn-blog\/uploads\/2024\/03\/Screenshot-from-2024-03-27-12-26-23-624x522.png 624w, \/blog\/wp-ttn-blog\/uploads\/2024\/03\/Screenshot-from-2024-03-27-12-26-23.png 920w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><b>References<\/b><\/p>\n<p><a href=\"https:\/\/platform.openai.com\/docs\/guides\/images?context=node\"><span style=\"font-weight: 400;\">https:\/\/platform.openai.com\/docs\/guides\/images?context=node<\/span><\/a><\/p>\n<div 
class=\"ap-custom-wrapper\"><\/div><!--ap-custom-wrapper-->","protected":false},"excerpt":{"rendered":"<p>There is no doubt that our generation has to go hand in hand with generative AI capabilities.\u00a0 It brings in efficiency in day to day work by super accelerating the launch of new products and features in any kind of an organization. Today here in this blog, we delve into the ways we can use [&hellip;]<\/p>\n","protected":false},"author":1755,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"iawp_total_views":232},"categories":[5868],"tags":[1001,5733,5734,5673],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/60964"}],"collection":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/users\/1755"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/comments?post=60964"}],"version-history":[{"count":2,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/60964\/revisions"}],"predecessor-version":[{"id":61050,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/60964\/revisions\/61050"}],"wp:attachment":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/media?parent=60964"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/categories?post=60964"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/tags?post=60964"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}