Facing the XM Cloud Adoption Challenge
Ahmed Okour and Alex Lentz delivered a very interesting session at Sitecore Symposium 2024 titled “Accelerate Sitecore XM Cloud adoption using generative AI”. The challenge was to use generative AI to get to XM Cloud faster, either by converting MVC code over to Next.js or by generating code directly from Figma files or design images.
One of the first tips the pair shared was that any developer introducing generative AI chat tools into their workflow absolutely needs to keep revising their prompts. You can even ask the AI itself to help: have it explain what it understood the request to be, then use that feedback to make your prompts clearer and more refined, which leads to better outputs.
When looking at the tools available out there, their team outlined several:
- LLM Models
  - OpenAI (ChatGPT): Their models are very good for general knowledge, but not the best for tasks like coding.
  - Anthropic (Claude): Their models are better suited to human tasks like coding.
- Developer Tools
  - Vercel AI SDK: A full SDK for building AI features into your apps.
- Design to Code
  - Locofy AI: Can take designs and map them into code in your target language.
  - Visual Copilot (by Builder.io): Figma-to-code tooling.
  - Anima: Offers Figma, Adobe XD, and VS Code plugins.
Sitecore JSS Copilot
Ahmed and Alex then introduced their Sitecore JSS Copilot project. The tool has two main uses:
- Design to Code: This allows developers to take a component design and convert it into a Next.js component using JSS that can be used in your XM Cloud solution.
- Code Conversion: To convert older non-headless Sitecore code to an XM Cloud headless equivalent.
The tool is publicly available for free, but it comes with limitations: you must sign in with a Google account to use it, and you can only request 25 conversions.
The general settings let you select the model (Anthropic Claude 3.5 Sonnet or OpenAI GPT-4o) and provide custom prompt instructions that are included in every conversion request.
Design to Code
In this mode, you can upload a Figma component design, a screenshot, or a wireframe image.
The result is two Next.js components. The first is a plain React component that receives the field content as props. The second is a Sitecore JSS component that wraps that React component, setting its props to the Sitecore fields rendered with JSS primitive field components like <Text> and <Link>. This makes it easier to unit test the component without having to mock the JSS fields and their values.
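A minimal sketch of that two-component split, modeled here with plain functions returning HTML strings instead of real React/JSS so the example stays self-contained (the Hero component and its Title/Cta fields are invented; actual generated code would be TSX using the <Text> and <Link> primitives):

```typescript
// Sketch of the two generated components, under the assumption that the
// generated pair looks roughly like this. Hero and its fields are invented.

// 1. Plain presentational component: receives simple values as props,
//    so unit tests never need to touch JSS field objects.
type HeroProps = { title: string; ctaHref: string; ctaLabel: string };

function Hero({ title, ctaHref, ctaLabel }: HeroProps): string {
  return `<section><h1>${title}</h1><a href="${ctaHref}">${ctaLabel}</a></section>`;
}

// Simplified stand-ins for the JSS TextField / LinkField shapes.
type TextField = { value: string };
type LinkField = { value: { href: string; text: string } };
type HeroRendering = { fields: { Title: TextField; Cta: LinkField } };

// 2. JSS wrapper component: unwraps the Sitecore fields and delegates to the
//    plain component (real generated code would render <Text>/<Link> here).
function HeroJss({ fields }: HeroRendering): string {
  return Hero({
    title: fields.Title.value,
    ctaHref: fields.Cta.value.href,
    ctaLabel: fields.Cta.value.text,
  });
}
```

Because the presentational component only ever sees plain values, its unit tests stay free of JSS mocking, which is exactly the benefit highlighted in the session.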
At the moment, Next.js is the only option for the JS framework selection. For styling, Tailwind, CSS modules, and styled-components are available.
If you are not entirely happy with the result, you can even chat with the Copilot to refine and iterate on the work to get the generation to where you want it to be!
This mode also has the ability to preview the component on mobile, tablet, and desktop breakpoints so that you can validate that the design is coming across as expected.
I find this to be really useful for fresh “greenfield” projects where you are starting from a creative design. It is also a great tool when a customer wants a redesign of their existing site, or even if you just want to spin up a few quick landing pages without involving a lot of dev time.
Code Conversion
The second mode has the ability to convert older Sitecore code to XM Cloud compatible code:
- ASP.Net MVC Razor to Next.js component using JSS.
- Sitecore SXA Scriban to Next.js component using JSS.
- Rendering Contents Resolver code to a GraphQL query.
Unlike the design-to-code mode, the code conversion mode does not let you chat with the Copilot to refine the result. It also generates only a single component instead of two.
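To illustrate the last conversion type, a Rendering Contents Resolver that returned a datasource item's children might map to an Experience Edge GraphQL query along these lines (a hypothetical sketch: the NavItems name and the Title/Link field names are invented for the example):

```typescript
// Hypothetical GraphQL query a "datasource children" Rendering Contents
// Resolver could be converted to. Field names are invented.
const childrenQuery = `
query NavItems($datasource: String!, $language: String!) {
  item(path: $datasource, language: $language) {
    children {
      results {
        title: field(name: "Title") { value }
        link: field(name: "Link") { jsonValue }
      }
    }
  }
}`;

// In a JSS component-level query, the variables are typically bound from the
// rendering's datasource and the context language.
function buildVariables(datasource: string, language: string) {
  return { datasource, language };
}
```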
Future State
Their roadmap includes adding a user library of generated or saved code, and also the ability to preview a generated component on your own JSS site.
I left the session with the understanding that the code was open-source, allowing anyone to clone it, run it with their own LLM API keys, and use it without limits. Unfortunately, the repository is currently private. I contacted one of the authors and will update this blog if it becomes a possibility.
Try it out right now: https://jss-copilot.vercel.app/