IMPULSE #8: ProtoPie 101 Crash Course – Part 2

After completing the first part of the ProtoPie 101 Crash Course, I was excited to continue my learning journey. Unit 2 focuses on creating micro-interactions, covering three key topics:

  1. Screen transitions
  2. Scrolling & paging
  3. Slide menus

The pace of the training picked up in this section, making it more engaging but also requiring more attention. Here are my takeaways from each topic.

Screen transitions

Trying out different transition types was interesting, and I particularly liked the smart transition, which is similar to Figma’s Smart Animate. It enables smooth custom transitions between two states as long as elements share the same name across scenes. Since this is a familiar concept for many designers, it makes ProtoPie feel intuitive right from the start.

A notable feature is the ability to add the system status bar of a specific smartphone frame without manually inserting it as a layer. This helps maintain consistent states and transitions between screens.

However, one limitation became clear: all transitions happen simultaneously without the ability to sequence animations. Thankfully, there’s a workaround. By animating elements first and then applying a smart transition, we can control the animation order. The instructor demonstrated using scale and move responses first before adding the transition.

Another useful feature introduced in this section was the reorder response, which allows changing the stacking order of layers. The four reorder options include:

  1. Move forward one in the stack
  2. Move backward one in the stack
  3. Move to the top
  4. Move to the bottom

Finally, adding a jump response enables smooth transitions between scenes. Once all responses are added, they can be sequenced independently in the timeline, making animations more dynamic and structured.

Three ways to do a screen transition:

  1. Built-in transitions
  2. Custom smart transitions
  3. Animating elements before applying a transition

Scrolling & paging

The crash course provided well-prepared, pre-made material, which made following along much easier.

A container essentially masks a scrollable area. Any container can be turned into a scrolling container, with three scrolling options:

  • Scroll (continuous movement in a direction)
  • Paging (stepwise scrolling)
  • None (no scrolling enabled)

Interestingly, the container tool is a standalone tool in the toolbar. The scrolling direction can be set to vertical or horizontal, and overscroll effects can be enabled or disabled. The process was extremely simple and intuitive!

Paging and carousel elements

With paging, it’s possible to position carousel elements stepwise in the center of the screen while scrolling horizontally. However, one issue emerged: the last item aligns with the screen’s edge instead of stopping at the center. To fix this, an invisible rectangle can be added at the end to create spacing. Initially, I thought this method was not elegant, but just moments later the instructor introduced a better alternative – adjusting the properties panel. It’s great to see that ProtoPie anticipates these usability needs!
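
The spacing that the invisible rectangle provides can also be worked out arithmetically. As a minimal sketch (the function name and values are my own, not from the course), the trailing spacer just needs to cover the gap between the last item and the screen center:

```python
def spacer_width(screen_width: float, item_width: float) -> float:
    """Width of the trailing spacer needed so the last carousel
    item can stop at the horizontal center of the screen."""
    return (screen_width - item_width) / 2

# e.g. on a 375 pt wide frame with 300 pt wide cards:
print(spacer_width(375, 300))  # 37.5
```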

Constraints in containers

This was a short but important topic. By default, containers are anchored to the top-left corner and elements inside do not resize automatically. However, deselecting fixed height or width allows elements to resize proportionally. This feature is super useful for responsive design.

Three ways to create slide menus

As the instructor pointed out, there are multiple ways to achieve the same effect in ProtoPie. For slide menus, three trigger types can be used: drag, pull, and fling.

1. Fling trigger

The fling trigger was applied to a side menu layer, using a move response to shift it into view. However, there was a problem – the sidebar was off-screen, making it unclickable.

To fix this, the fling trigger can be applied to any visible object, affecting the sidebar indirectly. A better approach is to add a touch area, extending the interactive zone without requiring invisible layers. This is a clever and efficient way to improve usability!

To close the menu, a second fling trigger can be used with either a move response or a reset response (as seen in Unit 1). One downside is that the gesture must fully complete before the menu moves, which feels slightly unnatural from a user perspective.

2. Pull trigger

The pull trigger works differently – it’s based on distance pulled rather than absolute position. This means the menu moves in sync with the user’s finger, making it a more intuitive way to implement slide menus. Additionally, when released, the menu will automatically snap open or closed based on its position.

The instructor also mentioned that the pull trigger is perfect for pull-to-refresh interactions, which is great to know!

3. Drag trigger

The drag trigger behaves similarly to fling but requires a move response. By default, dragging moves the object in any direction, but setting it to horizontal-only in the properties panel restricts movement.

A potential issue: Users can drag beyond the intended position. To fix this, custom limits can be set in the properties panel. However, the menu still doesn’t snap closed by itself.

To fix this, we use a touch-up trigger combined with a condition. This allows defining a threshold – if the menu is more than 50% open, it stays open; otherwise, it snaps closed. Conditions make interactions much more flexible!
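
In code form, the threshold condition ProtoPie expresses visually could be sketched like this (a hypothetical illustration; names and the coordinate convention are assumptions, not ProtoPie’s API):

```python
def menu_snaps_open(menu_x: float, menu_width: float,
                    threshold: float = 0.5) -> bool:
    """On touch-up, decide whether a slide menu should snap open.

    menu_x is the menu's current x position, where -menu_width means
    fully closed (off-screen left) and 0 means fully open.
    """
    open_fraction = 1 + menu_x / menu_width  # 0.0 closed .. 1.0 open
    return open_fraction > threshold

# Dragged most of the way open -> stays open:
print(menu_snaps_open(-50, 300))   # True
# Barely opened -> snaps closed:
print(menu_snaps_open(-250, 300))  # False
```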

Further learning

At the end of this unit, the course provided additional tutorials showcasing what’s possible with ProtoPie. These examples were inspiring and raised my interest in advanced interactions. The unit concluded with another knowledge test, reinforcing the learning experience.

Conclusion

Completing Unit 2 of the ProtoPie 101 Crash Course was an exciting and rewarding experience. The structured approach made even complex interactions easy to follow, and I now feel confident creating sophisticated micro-interactions.

ProtoPie’s approach to prototyping continues to impress me – especially its flexibility, intuitive design and developer-friendly handoff process. I’m eager to continue learning and exploring more advanced features in the upcoming units.

Resources

https://learn.protopie.io/start

https://learn.protopie.io/course/protopie-101

https://cloud.protopie.io/p/1a8b65c2398caca10872b720?ui=true&scaleToFit=true&enableHotspotHints=true&cursorType=touch&mockup=true&bgColor=%23F5F5F5&bgImage=undefined&playSpeed=1

IMPULSE #7: ProtoPie 101 Crash Course – Part 1

Inspired by my colleagues, I thought signing up for the official ProtoPie 101 Crash Course would be good preparation for the practical component of my master’s thesis. Since I may want to use sensor technology and both the visual and auditory levels of communication, ProtoPie promises to emulate what feels like a programmed app thanks to its advanced prototyping capabilities.

Getting started

Every chapter of the ProtoPie 101 Crash Course begins with a clear outline of “What you’ll need” and “What you’ll learn.” This approach helps set expectations and ensures users are prepared with the necessary tools and mindset before diving in.

Unit 1: Introduction to ProtoPie

ProtoPie introduces a unique conceptual model based on object + trigger + response. The logic is simple: If I tap the square layer, then it will move. This principle extends to multiple micro-interactions, like carousels.
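
The object + trigger + response model can be sketched in a few lines of code. This is purely a conceptual illustration of the mental model – ProtoPie itself is a no-code tool, and all names here are hypothetical:

```python
class Layer:
    """A toy stand-in for a ProtoPie layer with attached interactions."""

    def __init__(self, name: str, x: int = 0, y: int = 0):
        self.name, self.x, self.y = name, x, y
        self.interactions = {}  # trigger name -> list of responses

    def on(self, trigger, response):
        self.interactions.setdefault(trigger, []).append(response)

    def fire(self, trigger):
        for response in self.interactions.get(trigger, []):
            response(self)

square = Layer("square")
# "If I tap the square layer, then it will move."
square.on("tap", lambda layer: setattr(layer, "x", layer.x + 100))
square.fire("tap")
print(square.x)  # 100
```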

The prototyping process in ProtoPie involves three key platforms:

  • Studio for creating prototypes
  • Player for displaying prototypes on mobile devices
  • Cloud for sharing and collaboration

To get started, ProtoPie outlines four essential steps:

  1. Add assets
  2. Make interactions
  3. Test & share
  4. Collaborate

The structure is intuitive, making the onboarding process smooth and easy to follow.

Step 1: Add assets

This step introduces knowledge tests with multiple-choice questions. For instance, I encountered this question:

“When you export an entire frame or artboard from your design tool, how does this get imported into ProtoPie?”

The answer options were:

  • Object in Scene
  • Scene
  • New Pie
  • Frame

At first, it was unclear what exactly constituted a scene, a frame, or an object in a scene. However, the narrator clarified these distinctions after the first question, making it much easier to understand.

Importing assets from Figma was incredibly smooth. Everything transferred seamlessly, making the process highly efficient.

Step 2: Make interactions

My task in this step was to create an interaction where an icon on a home screen wiggles after a long press, allowing position edits, and stops wiggling when exiting edit mode. To achieve this, I used:

  • Long press and tap as triggers
  • Rotate and stop as responses

Some standout features during this step included:

  • Duplicating triggers and responses, which made iteration much faster
  • The timeline feature, which allows responses to occur sequentially – this was an excellent addition
  • The reset response, which makes it easy to revert an interaction back to its initial state

Overall, creating interactions in ProtoPie felt logical, intuitive, and efficient.

Step 3: Test & share

Now came the phase of installing the ProtoPie Player app. Prototypes can be displayed by:

  • Connecting a mobile device to a computer via USB
  • Scanning a QR code when both devices are on the same WiFi network

The display experience was just as seamless as in Figma. Sharing prototypes was also straightforward. A link can be distributed with controlled access settings, determining:

  • Who can view the prototype
  • Who can download the file for further editing
  • What UI elements are visible when opening the link

This flexibility ensures that collaboration remains secure and structured.

Step 4: Collaborate

The course then introduced interaction recordings (formerly known as “interaction recipes”). Unlike other prototyping tools that attempt to generate code – often unsuccessfully – ProtoPie focuses on providing accurate interaction values for developers.

With interaction recordings, developers can:

  • Play and scroll through the timeline at different speed levels to fully understand the interaction
  • Share single recorded interactions via individual sub-links

Additionally, ProtoPie allows elements with interactions to be transformed into reusable components, similar to Figma. Once inside a component, interactions move with it. Components can also be uploaded to a team library, enabling:

  • Cloud-based collaboration
  • Editing in library mode with cloud-synced updates
  • Easy pasting of components from local files to the cloud team library (though changes need to be published before others see them)

Knowledge test & learning progress

The first chapter concludes with a knowledge test, designed to help measure comprehension. If the results indicate gaps in understanding, the system suggests repeating the chapter before proceeding. I found this to be an excellent learning reinforcement tool. Additionally, test results can be downloaded for documentation, making it a useful feature for employer verification.

After completing this first chapter, I had reached 20% progress in the overall crash course. This made me eager to continue and explore the advanced capabilities of ProtoPie.

Final thoughts

My experience with the ProtoPie 101 Crash Course so far has been extremely positive. The course is structured in a way that makes learning both engaging and practical. The hands-on approach, combined with well-integrated knowledge tests, ensures that users truly get the concepts before moving forward.

ProtoPie’s approach to prototyping – focusing on visual interaction recordings instead of auto-generated code – stands out as a particularly developer-friendly and intuitive method. It eliminates ambiguity and allows for precise handoff between designers and engineers.

With 20% of the course completed, I am excited to see what’s next! Stay tuned for further insights as I continue my ProtoPie learning journey.

Resources

https://learn.protopie.io/start

https://learn.protopie.io/course/protopie-101

https://cloud.protopie.io/p/1a8b65c2398caca10872b720?ui=true&scaleToFit=true&enableHotspotHints=true&cursorType=touch&mockup=true&bgColor=%23F5F5F5&bgImage=undefined&playSpeed=1

IMPULSE #4 ProtoPie Crash Course

Introduction

Prototyping is one of the most important phases in the design process for making ideas tangible. In one of our courses, we were tasked with developing a ProtoPie prototype of one of our projects. We chose to develop our concept from the gamification course this semester.

I really loved this approach to high-fidelity prototyping. In my experience Figma sometimes limits what is feasible to showcase in a prototype. ProtoPie allows me to create realistic prototypes that are similar to a final product. In this research article, I will share how I went about getting started in ProtoPie, how I learned the software and how it differs from Figma.

What is ProtoPie?

ProtoPie is a tool that allows you to create interactive prototypes. The prototypes look, feel and behave like a finished software product, even though it is a no-code tool. Unlike static pages, prototypes can simulate real interactions and verify concepts. Button clicks and screen transitions can be simulated in Figma, but ProtoPie can also simulate complex interactions such as voice commands or tilt gestures. You can use your device’s native sensors, such as the camera, microphone or even GPS.

How is ProtoPie different from Figma?

Both ProtoPie and Figma are design tools, but they serve different purposes. Figma is my choice when it comes to designing websites, user interfaces and simple prototyping. I love the collaborative aspect of Figma. In ProtoPie we struggled with collaboration. We had to save multiple versions of a file to “collaborate”. One person worked on one part of the prototype, while the other finished a different section and then we combined them. It was not optimal.

However, ProtoPie is the better choice for dynamic high-fidelity prototypes. For example, I could build a password validation that shows different reactions depending on the input.

First project

When we started to get to know the software, we decided to do the ProtoPie 101 crash course. It was a really nice e-learning experience. Everything was easy to understand and the accompanying Figma files were well prepared. The course starts with the basics, which is perfect if you’ve never worked with ProtoPie or other prototyping software before.

I started by learning the core concepts – triggers and responses – which form the basis for interactions. My first project was to import a design from Figma, create a simple button interaction and test it on my smartphone. The Figma to ProtoPie plugin worked great. The smartphone app for displaying prototypes was not as user-friendly but worked well after all. By the end of the lessons, I started to get the hang of it. Most of the advanced features are self-explanatory once you understand the basic concepts.

Working with native sensors

After exploring the basic concepts of triggers and responses, we viewed the list of supported sensors and inputs. Working with devices’ native sensors is where ProtoPie really opened up new possibilities for our projects. It means you can not only create simple touch interactions, but also prototypes that respond to motion, voice, camera or other device inputs. This was completely new to me, and it sparked new ideas about what is possible with prototyping for projects in my portfolio.

I loved that I could delve into these functions without knowing how to code. For example, I created a prototype that uses my smartphone’s camera input as part of the interaction. I was introduced to voice commands, allowing me to integrate voice control into a prototype – something that is becoming increasingly important in my opinion. I tested this in a simple example where an app responded to the word “start” and then triggered an animation.

After that, we focused back on the project we chose to enhance with this ProtoPie prototype. It is an app that connects people through gamified experiences. We compared different sensors and decided to use the native iPhone compass for our game.

Conclusion

The tool helped us complete our concept and make the gamified experience tangible. The hands-on approach made it easy to learn step by step and feel successful early in the learning curve. I want to continue learning how to use variables and formulas to create more dynamic interactions. I would also like to build a prototype in the future that makes use of the “Send and Receive” feature. This way I could connect multiple screens to create a multi-screen experience. I will keep this in the back of my mind for a future project. By the end of the course, I feel like I have a complete toolset in hand for prototyping.

Links

https://www.protopie.io

https://learn.protopie.io/course/protopie-101

Impulse 04 // Crash Course ProtoPie Part 2

ProtoPie 101 Crash Course | ProtoPie School

After completing the first half of the ProtoPie Crash Course, I was motivated to dive into the second half because of the skills I had already gained so quickly in the first part. With three more lessons, this time about advanced techniques, I gained a deeper understanding of ProtoPie’s capabilities. The content covered Conditional Logic and Triggers, Variables and Functions, and a Wrap-Up to summarise and review the learning from the whole course.

The fourth lesson introduced me to conditional logic and advanced triggers. These features allowed me to create interactions that responded intelligently to user inputs. This was a significant step up from the basic interactions we learned earlier.

Conditional Logic

We started by creating a password validation interaction using conditionals. This exercise showed me how to add logic to prototypes without needing to write a single line of code. By setting conditions, I was able to create a prototype that checked whether a password met specific criteria and provided real-time feedback to the user.
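
As a rough illustration of the logic behind this exercise (expressed here in code, which ProtoPie replaces with visual conditions; the specific criteria below are assumptions for the sketch):

```python
import re

def password_feedback(password: str) -> dict:
    """Return per-criterion flags, the kind of real-time feedback
    each condition in the prototype would drive."""
    return {
        "long_enough": len(password) >= 8,
        "has_digit": bool(re.search(r"\d", password)),
        "has_upper": bool(re.search(r"[A-Z]", password)),
    }

def password_valid(password: str) -> bool:
    """The password passes only if every criterion is met."""
    return all(password_feedback(password).values())

print(password_valid("Secret123"))  # True
print(password_valid("short"))      # False
```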

Chain and Range Triggers

Next, we explored the Chain Trigger, which is used for creating navigation aids. I designed an interaction where tapping on different sections of a menu smoothly scrolled to the corresponding part of the page. The Range Trigger was another great tool, which I used to create an auto-play video carousel that responded dynamically as the user scrolled. Both triggers added a new layer of sophistication to my prototypes.
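
The idea behind the Range Trigger – a response that is active while a value sits inside a range – can be sketched as follows. This is a hypothetical reconstruction of the auto-play condition, not ProtoPie’s actual implementation:

```python
def video_should_play(scroll_y: float, video_top: float,
                      video_height: float, viewport_height: float) -> bool:
    """Auto-play while the video sits fully inside the viewport.

    scroll_y is the viewport's top edge in content coordinates.
    """
    fully_visible_top = video_top >= scroll_y
    fully_visible_bottom = video_top + video_height <= scroll_y + viewport_height
    return fully_visible_top and fully_visible_bottom

# Video at y=1000, 200 tall, in an 800-tall viewport:
print(video_should_play(900, 1000, 200, 800))  # True  (in range)
print(video_should_play(0, 1000, 200, 800))    # False (scrolled away)
```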

The fifth lesson was all about harnessing the full power of ProtoPie by using variables, functions, and components. These features opened up possibilities for creating complex, yet manageable, prototypes.

Variables and Functions

I started by learning how to use variables and formulas to store and manipulate data within a prototype. This was a game-changer for me, as it allowed for dynamic interactions. For example, I created a camera focus point interaction where users could tap anywhere on the screen to adjust the focus dynamically. Using variables made the interaction feel incredibly realistic.
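
To make the variable idea concrete: the focus-point interaction essentially stores the tap coordinates in variables and clamps them so the focus reticle stays on screen. The sketch below is my own reconstruction, with assumed names and sizes:

```python
def focus_position(tap_x: float, tap_y: float,
                   screen_w: float, screen_h: float,
                   reticle: float = 80) -> tuple:
    """Clamp the tap point so the focus reticle stays fully on screen.

    The clamped (x, y) pair plays the role of the two ProtoPie
    variables that the focus layer's position is bound to.
    """
    half = reticle / 2
    x = min(max(tap_x, half), screen_w - half)
    y = min(max(tap_y, half), screen_h - half)
    return x, y

# A tap near the left edge gets pulled inward:
print(focus_position(10, 400, 375, 812))   # (40, 400)
# A tap in the middle is used as-is:
print(focus_position(200, 300, 375, 812))  # (200, 300)
```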

Custom Greetings and Smart Light Control

Next, I built a customized greeting interaction that displayed the user’s name based on their input. This feature demonstrated how ProtoPie could personalize experiences. We also designed a smart light control prototype where users could adjust the brightness and color of a light bulb. This exercise showcased how ProtoPie could simulate IoT interactions effectively.

Multi-Screen Smart Home Control

The highlight of this lesson was creating a multi-screen smart home control interface. By using components and the Send & Receive feature, we linked multiple screens together seamlessly. This exercise emphasized the importance of reusability and organization in prototyping complex systems.
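
Conceptually, Send & Receive is a publish/subscribe channel: one screen sends a named message, and every listener subscribed to that name reacts. A minimal sketch of that idea (hypothetical names; ProtoPie wires this visually):

```python
class Channel:
    """Toy message bus standing in for ProtoPie's Send & Receive."""

    def __init__(self):
        self.listeners = {}  # message name -> list of callbacks

    def receive(self, message, callback):
        self.listeners.setdefault(message, []).append(callback)

    def send(self, message, value=None):
        for callback in self.listeners.get(message, []):
            callback(value)

channel = Channel()
brightness = []
# The light-detail screen listens for brightness changes...
channel.receive("set_brightness", brightness.append)
# ...and the dashboard screen sends one.
channel.send("set_brightness", 0.7)
print(brightness)  # [0.7]
```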

The final lesson was a wrap-up session that consolidated everything we had learned throughout the course. It included a knowledge exam, which tested our understanding of ProtoPie’s features. I was happy to pass the exam and receive my certificate for the ProtoPie crash course.

Helpful Resources

Before the course ended, we were provided with a lot of resources to continue our ProtoPie journey. These included detailed documentation, community forums, and example projects. Knowing that I have these resources to look up gives me confidence to tackle even the most challenging prototyping tasks in the future.

The second half of the ProtoPie crash course, like the first, was interesting and full of useful skills and possibilities for future prototypes. It opened my mind not only to how to design interactions that are both functional and intuitive, but also to the fact that I am now able to test and prototype them myself. The hands-on exercises, as in the first part, allowed me to experiment with the more advanced features and gain practical experience, which is how I really learn: by trying things and doing them. By the end of the course, I felt equipped to create prototypes that go beyond static designs and truly mimic real-world interactions.

Because ProtoPie is so easy to use, I think it will become my go-to tool for prototyping. It is also a good fit for my Master’s thesis, in which I plan to connect the analogue and digital worlds in a calmer way by creating new ways of interacting between them. As I plan to build and test a physical prototype in the thesis, I will most likely need some sort of digital layer, which I now feel able to realise, or at least mock up.

Impulse 03 // Crash Course ProtoPie Part 1

ProtoPie 101 Crash Course | ProtoPie School

For this semester and next, we have been given the opportunity to use ProtoPie with a full licence as part of our studies. Because this was introduced in a subject where we could choose the topic we wanted to work on, I chose a group project to further develop a game we made in the first semester of the Master’s. I decided to use two of my impulse blog posts to learn how to use and prototype with ProtoPie. Fortunately, ProtoPie offers a comprehensive crash course, divided into six lessons, to learn and master many of the possibilities it offers. As the course is quite extensive, I have split it into two parts, each covering three lessons. So here is the first half of the course, covering the basics, interactive transitions & sensor-based interactions.

The course started off with a comprehensive introduction to ProtoPie. The first lesson covered the tool’s three main purposes: to create, test, and share prototypes. This was perfect for me, as I’d only ever worked with Figma’s prototyping tools before. ProtoPie promised to enable more dynamic and realistic interactions.

Creating Prototypes

It started with learning how to set up our projects. The process was straightforward. Once the project was ready, we explored how to seamlessly import designs from tools such as Figma, Adobe XD or Sketch. Next, we were introduced to ProtoPie’s basic features. I learned how to create interactions by simply dragging and dropping elements. The interface was intuitive, even for someone with limited experience in advanced prototyping. Creating interactions felt like building with digital Lego – any action or trigger could be linked to create a seamless process.

Testing and Sharing

Once my prototype was ready, the next step was to test and share it. ProtoPie allows us to view our prototypes directly on devices such as smartphones and tablets, which made them tangible. I could see how the designs would work in real-life scenarios. Sharing was just as easy. I uploaded my project to the ProtoPie Cloud, which made it easy to collaborate with others. Another good feature is the interaction recordings. These allow you to document specific interactions. ProtoPie also has the functionality of Interaction Libraries, which allows teams to standardise design components. This can certainly save a lot of time on larger projects.

In the second lesson, it was time for hands-on practice creating various types of interactions.

Screen Transitions

It started by teaching how to prototype automatic, semi-automatic, and fully custom screen transitions. I particularly enjoyed working on custom transitions because they allowed me to design interactions tailored to a specific design case.

Scrolling and Paging

Next, the course dived into scrolling and paging interactions. I had always struggled to make scrolling interactions look good or useful in Figma. In ProtoPie the results were realistic, exactly like the scrolling you’d expect in a native app.

Sliding Menus

The last part of this lesson was designing sliding menus. We explored three different ways to create them, ranging from simple swiping gestures to more complex interactions that combined multiple triggers.

The third lesson took ProtoPie’s capabilities to the next level by introducing sensor-aided interactions. This feature truly sets ProtoPie apart from other prototyping tools because it enables designers to use device sensors without needing any coding knowledge.

Using Device Sensors

The lesson started with an introduction to using a phone’s camera in prototypes. I created interactions where the camera feed became part of the design. This was particularly useful for scenarios like augmented reality apps or interactive tutorials.

Input Fields and Native Keyboards

Next, the course explored prototyping with input fields and native keyboards. This feature was a pleasant surprise, as it allowed me to create realistic forms and search bars that behaved just like the ones in real apps. I can already see how this could improve user testing sessions, as participants would interact with the prototypes in a natural way.

Voice Interactions

The final part of this lesson focused on voice interactions. ProtoPie made it easy to incorporate voice commands and responses into prototypes. This feature opened endless possibilities for designing interfaces for voice-activated devices or accessibility features. I was amazed at how simple it was to implement this functionality.

The first three lessons of the ProtoPie crash course already showed a lot of possibilities in prototyping. Each lesson built on the previous one, gradually introducing more complex features. I appreciated the hands-on approach, as it allowed me to apply what I learned immediately, which is the best way for me to learn and retain things.