teleportHQ Sketch plugin
Criss Moldovan
Posted on May 25

Hello! We’ve built a real-time Sketch-to-code engine (with live preview).


Quick questions:

Aren’t (Web / Mobile) Designers and Developers supposed to speak the same language?

Aren’t we all in pursuit of apps / sites that look pretty and work smoothly?

Can we make the whole mobile / web building process a bit more humane?

There’s clearly a gap between designers and developers. This seems all too familiar, and to my surprise it seems we’ve been mostly trying to navigate around it instead of facing it head on.

Can we right this wrong?

About a year back, as I was working on a mobile app, I got stuck in a never-ending ping-pong with my designer, mainly tweaking how the app looked and behaved. Most of these changes were only about positioning, colors, spacing and so on. Instead of using my time to implement the real, value-bringing guts of the app, about 60% of it was spent on non-business-logic work, a.k.a. “pixel pushing”. Essentially, every “action” the designer performed in Sketch, I had to re-do in code. That looked to me like a completely inefficient process and a paved way to frustration. And a missed deadline. Or two :)

My first thought was “I’m probably not using the right tools… there must be a way…”

Not being a designer myself, and fairly unfamiliar with Sketch, I started looking around for solutions that could help, but all I could find were plugins or apps that did only part of the job. So why bother using something that would just add more complexity to an already inefficient process?

Hang on, so… what are we actually looking for?

In a nutshell, this is what I would like:

the designer should be able to “preview” the designs on his device / browser in real time. But not simulated: he should see the preview as the real deal, as if a developer had implemented it for him.

when the designer draws a button, the developer should get a “Button” code snippet ready to be used in the app / site

The developer should choose the snippet’s language and dialect or coding style.

It looks like there are some excellent tools to cover parts of the requirements, and they fall within two categories: prototyping tools and target-specific “helpers”.

While tools such as InVision and, lately, Sketch’s own built-in prototyping system do a great job for prototyping, they only go as far as “prototyping”. They are mostly approximations or simulations of the end result, and they stop being useful right there. There’s not much more you can do with your prototypes once you have built them (apart from contemplating their beauty). They cannot be re-used or expanded further toward a “real”, production-ready result. Hence you now need a developer to code the UI, so back to square one.

On the other hand, target-specific tools such as Zeplin do a great deal to help the developer “pick” styling information, but they rarely give the full context. Plus, they are exactly what they are called, “target-specific”, with hardly any configuration possible.

Another approach worth mentioning has been proposed by the team at Anima, through their Launchpad plugin for Sketch, which exports plain HTML/CSS.

While there are plenty of tools that will get you a fair bit of the way toward the goal, the designer still needs a developer to be able to experience his designs in the “native world”, and a lot of manual tweaking is needed afterwards.

Then, the thought:

What would it take to capture the designer’s “input” and translate it to code, in real-time?

And this was the seed thought that brought together a bunch of techies whom I met at the 2017 JSHeroes conference in Cluj, Romania.

The most direct way we could imagine to tackle this challenge was to define the layout representation in a code-agnostic format from which, through a parser of some sort, we would generate the code.

Given we are describing a web / mobile document’s structure, a VDOM-like model would have been the initial choice, coupled with some concepts found in ASTs. We finally opted for a custom, stripped-down structure inspired by both, and JSON appeared to be the format of choice for the task at that time. Each element in the representation carries three fields:

  • type:  a string that defines the nature of the element. Regardless of whether we speak about web or mobile design & development, the building blocks are roughly the same: views, texts, images, inputs, and buttons that are eventually aggregated into more complex elements. We’d go as far as to say these building blocks are here to stay even if we move into the AR / MR world, so a generic descriptive naming convention can be agreed upon and translated, via a mapping, to any target.
  • styles:  a JSS object. JSS was chosen because it covers all web styling properties and can be translated to other formats, such as CSS, React styling objects, React Native StyleSheet objects, etc.
  • children:  an array of elements, or a string (in the case of a simple label, for example)
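To make the three fields concrete, here is a minimal, hand-written IR element — a box holding a label and an image. The values are illustrative, not the plugin’s exact output:

```javascript
// Illustrative JSON IR: a box ("View") containing a text label and an image.
// Field names follow the type / styles / children convention described above;
// the style values are plain JSS-like objects.
const helloWorldIR = {
  type: "View",
  styles: { position: "absolute", top: 0, left: 0, width: 320, height: 200 },
  children: [
    {
      type: "Text",
      styles: { fontSize: 24, color: "#333" },
      children: "Hello World" // a plain string: a simple label
    },
    {
      type: "Image",
      styles: { width: 64, height: 64 },
      children: []
    }
  ]
};
```

Because the structure is plain JSON, it can be serialized, diffed, and sent over the wire to a live previewer without any framework-specific baggage.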

To make all this more visual, let’s take an example. The following image shows a basic UI made of a box that contains a label and an image:

[Image: a box containing a “Hello World” label and an image]

We’ll publish more information soon about how the generators work but, for now, let’s look at what the generated React code would look like for our JSON IR:
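The embedded snippet from the original post isn’t reproduced here; as a stand-in, here is a minimal sketch of how such an IR could be walked and turned into React JSX text. The tag table and function names are hypothetical, not teleport’s actual generator:

```javascript
// Sketch of an IR -> React JSX generator (illustrative only; the real
// teleport generators are considerably more elaborate). A per-target tag
// table maps the generic element types onto React/DOM tags.
const TAGS = { View: "div", Text: "span", Image: "img" };

function toReact(node, indent = "") {
  const tag = TAGS[node.type] || "div";
  const style = JSON.stringify(node.styles || {});
  // A string child is a simple label
  if (typeof node.children === "string") {
    return `${indent}<${tag} style={${style}}>${node.children}</${tag}>`;
  }
  const kids = (node.children || [])
    .map((child) => toReact(child, indent + "  "))
    .join("\n");
  return kids
    ? `${indent}<${tag} style={${style}}>\n${kids}\n${indent}</${tag}>`
    : `${indent}<${tag} style={${style}} />`;
}

const jsx = toReact({
  type: "View",
  styles: { position: "absolute", width: 320, height: 200 },
  children: [
    { type: "Text", styles: { fontSize: 24 }, children: "Hello World" },
    { type: "Image", styles: { width: 64, height: 64 }, children: [] }
  ]
});
console.log(jsx);
```

Swapping the `TAGS` table (and the emitted syntax) is, conceptually, all it takes to retarget the same IR at another framework.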

Alternatively, the React Native code would look like this:
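That snippet isn’t reproduced here either. Conceptually, the React Native output differs mainly in its element names (`View`, `Text`, `Image` are already React Native primitives) and in collecting styles for a `StyleSheet.create` call. A hedged sketch of that style-collection step (the function name is hypothetical):

```javascript
// Sketch: collecting an IR tree's JSS-like style objects into the plain
// object shape that React Native's StyleSheet.create expects.
// (Illustrative; a real generator also handles flexbox defaults,
// unit conversion, unsupported properties, etc.)
function collectNativeStyles(ir) {
  const sheet = {};
  let counter = 0;
  (function walk(node) {
    // One auto-named entry per IR node, e.g. "view0", "text1"
    sheet[node.type.toLowerCase() + counter++] = node.styles || {};
    if (Array.isArray(node.children)) node.children.forEach(walk);
  })(ir);
  return sheet;
}

const sheet = collectNativeStyles({
  type: "View",
  styles: { width: 320, height: 200 },
  children: [{ type: "Text", styles: { fontSize: 24 }, children: "Hello World" }]
});
// sheet -> { view0: { width: 320, height: 200 }, text1: { fontSize: 24 } }
```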

NOTE: in this example the positioning is deliberately set to absolute for the sake of simplicity. We’ll cover this topic in a future article given the complexity of the subject. Meanwhile, you can take a look at Karl’s article about Figma-to-React code generation.

Sneak Peek

Here’s a little demo of how this all works. In the video you can see our early stage Sketch plugin in action and how the design-source > intermediary representation > code generation > live preview flow feels.

The video is a simulation of a designer’s experience building the JSHeroes website’s menu.

On the screen:

  • the left side: our dashboard shows the JSON IR and generated React code, side-by-side
  • the top right corner: a web live previewer
  • the lower right: Sketch with the Teleport plugin loaded.

Wrap-up

So far, we confirmed there’s a viable technological path for building real-time design-to-code experiences through which the design source and the target code can be completely decoupled.

So far, we’re able to generate React, React Native, Vue, HTML/CSS and AngularJS code.
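One way to picture that decoupling — with hypothetical mapping tables, not teleport’s actual configuration — is that each target simply supplies its own resolution of the generic IR element types:

```javascript
// Hypothetical per-target element mappings: the IR stays the same,
// only the lookup table changes per generated framework.
const MAPPINGS = {
  react: { View: "div", Text: "span", Image: "img" },
  "react-native": { View: "View", Text: "Text", Image: "Image" },
  html: { View: "div", Text: "span", Image: "img" }
};

function resolveTag(target, irType) {
  const table = MAPPINGS[target];
  if (!table || !(irType in table)) {
    throw new Error(`No mapping for "${irType}" in target "${target}"`);
  }
  return table[irType];
}
```

Adding a new output target then becomes a matter of writing a new mapping plus the target’s syntax emitter, without touching the design source or the IR.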

Sure, the holy grail would be a bidirectional approach, where the JSON IR could also be generated by interpreting source code; hence we love Jon Gold’s approach in React Sketch App. We’d highly recommend checking out the project.

We also aim to open-source these tools ASAP via GitHub, so stay tuned.

We’re looking forward to your thoughts and feedback, so please get in touch with us via Twitter.

Our blog’s code is automatically generated from a teleport project definition. The blog is open source, and you can learn more about how the technology works from our GitHub repo.
