Technical Manifesto of CraftGen

TL;DR: CraftGen is an advanced AI platform utilizing graph architecture for clarity and scalability, and the Actor Model for concurrent operations. It features an event-driven architecture for reliable data processing and is built on browser web technologies for universal access and ease of use. Key functionalities include dynamic tool and workflow generation, robust offline capabilities for enhanced data security in sensitive environments, and future integration with a desktop application. CraftGen is designed to simplify AI interaction for both technical and non-technical users, enhancing productivity and problem-solving capabilities. Aligning with industry standards through JSON Schema, it integrates with any API using the OpenAPI spec, as utilized by platforms like Zapier. It automates tasks ranging from data analysis to content management, streamlining workflows and enhancing efficiency.

CraftGen is motivated by the need for an accessible AI platform that simplifies complex AI interactions for a diverse user base. It addresses the gap between advanced AI technology and practical usability, aiming to provide a flexible, secure, and intuitive platform. CraftGen seeks to empower users, from developers creating custom AI solutions to businesses automating routine tasks, by providing an easily navigable and adaptable toolset. This initiative is driven by the goal of making AI technology a seamless part of everyday problem-solving and innovation.

Why Graph Architecture?

Choosing the right architecture is foundational to the success of any AI-driven platform. In developing CraftGen, we focused on a graph architecture for its flexibility and its clear representation of complex systems. This approach enables clear visualization of the relationships and dependencies between various components, agents, and workflows, aligning well with natural human cognition.

Graphs offer inherent scalability and flexibility, crucial for a platform designed to evolve and cater to a diverse user base. This architecture optimizes AI workflows, allowing us to identify paths for performance enhancement and parallel processing, resulting in faster and more efficient task execution.

Moreover, this graph structure aligns with our human-centric design principles. It mirrors the way humans conceptualize structures and processes, enhancing the platform's accessibility and user-friendliness.

An integral aspect of CraftGen's design is its support for cyclical dynamics in AI agent behaviors. Diverging from traditional linear workflows, these cycles facilitate continuous learning and adaptation in our agents, mirroring human problem-solving methods. As a result, our agents can handle complex and evolving tasks with advanced, human-like problem-solving capabilities. By integrating a graph architecture, CraftGen stands at the forefront of AI technology, offering sophisticated yet intuitive solutions for our users.
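The cyclical dynamics described above can be sketched as a loop in which a step's output feeds back as its own input until a condition is met. The following is a minimal, hypothetical illustration of the idea; the `refine` step stands in for an LLM call, and none of these names are CraftGen's actual API:

```typescript
// Sketch of a cyclical agent loop: unlike a linear pipeline, the agent
// revisits the same "refine" step until its output meets a condition,
// mirroring iterative human problem-solving. All names are illustrative.
type Draft = { text: string; score: number };

function refine(draft: Draft): Draft {
  // Stand-in for an LLM call that improves the draft.
  return { text: draft.text + "!", score: draft.score + 1 };
}

function agentLoop(initial: Draft, threshold: number): Draft {
  let draft = initial;
  while (draft.score < threshold) {
    draft = refine(draft); // the cycle: output feeds back as input
  }
  return draft;
}

const result = agentLoop({ text: "idea", score: 0 }, 3);
```

In a graph representation, this loop is simply an edge that points back to an earlier node, which a linear pipeline cannot express.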

What is Actor Model?

The Actor Model, a robust framework with origins in the 1970s, has become a cornerstone in modern concurrent and distributed system design, utilized in frameworks like Akka and languages like Erlang. Its core tenet is encapsulated in the phrase 'everything is an actor,' highlighting its approach where actors are the fundamental units of execution.

In this model, each actor in the system operates independently, similar to performers in an unconventional theater. They execute tasks, manage their internal state, create new actors, and communicate predominantly through message-passing. This decentralized form of interaction mirrors how actors in a play might exchange messages backstage without direct interaction, each following their own script yet contributing to the overall narrative.

Key to the Actor Model, and indeed to its implementation in CraftGen, is the principle of non-interference among actors. Actors do not directly manipulate each other's state; instead, they influence one another through messaging. This is akin to exchanging notes in a lively, chaotic theater, where each performer, or actor, reacts based on the messages received, continuously adapting to the unfolding drama.

This approach brings several advantages to CraftGen:

  1. Scalability and Concurrency: Like a play with multiple independent actors, CraftGen can handle numerous operations simultaneously, making the system highly scalable and efficient.
  2. Resilience: The system is resilient like a troupe that continues the show even if one actor forgets a line. If one actor in CraftGen fails, it can be restarted or replaced without disrupting the overall system.
  3. Modularity: Each actor operates in isolation, akin to having their own mini-script. This modularity makes the system easy to maintain and update.
  4. Efficient Communication: Just as actors in a play pass notes without interrupting the performance, actors in CraftGen communicate asynchronously, enhancing overall system responsiveness.

Therefore CraftGen, leveraging the Actor Model, resembles a dynamic theater where each actor independently and efficiently contributes to a complex, interconnected performance, driving forward the narrative of advanced, AI-powered solutions.
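As a concrete illustration of these principles, here is a minimal actor sketch: isolated private state, a mailbox, and interaction only through messages. This is a generic Actor Model sketch under assumed names, not CraftGen's implementation:

```typescript
// A minimal, illustrative actor: an isolated unit of execution with its
// own state and a mailbox, influenced only through messages.
type Message =
  | { type: "increment" }
  | { type: "get"; reply: (n: number) => void };

class CounterActor {
  private state = 0; // private state, never touched from outside
  private mailbox: Message[] = [];
  private processing = false;

  // The only way to interact with the actor: send it a message.
  send(msg: Message): void {
    this.mailbox.push(msg);
    if (!this.processing) this.drain();
  }

  // Messages are handled one at a time, so the state needs no locks.
  private drain(): void {
    this.processing = true;
    while (this.mailbox.length > 0) {
      const msg = this.mailbox.shift()!;
      if (msg.type === "increment") this.state += 1;
      else msg.reply(this.state);
    }
    this.processing = false;
  }
}

const counter = new CounterActor();
counter.send({ type: "increment" });
counter.send({ type: "increment" });
let observed = 0;
counter.send({ type: "get", reply: (n) => { observed = n; } });
```

Because the only entry point is `send`, a failed actor can be replaced with a fresh instance without any other component noticing, which is the resilience property described above.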

Event Architecture in CraftGen

In this section, we delve into how CraftGen's event architecture not only enhances system functionality but also offers numerous benefits in terms of debugging, modularity, and data analysis. Here's an outline of the key points:

1. Ease of Debugging:

Traceability: Event architecture in CraftGen records each action as a discrete event, making it easier to trace the sequence of actions leading up to a certain state. This traceability is invaluable for debugging, as it allows developers to pinpoint the origin of an issue by reviewing the event log.

Evidentiary Nature: Since events in this architecture are immutable (updates and deletions are not allowed), the event log serves as an incontrovertible record of system activity. This feature is crucial when diagnosing problems, as it provides a reliable and unaltered history of events.

2. Modularity:

Decoupled Components: Event-driven systems inherently promote modularity, with components reacting to event notifications. This decoupling means that changes in one part of CraftGen do not necessitate changes in others, simplifying maintenance and updates.

Scalability: Modularity in event architecture also enhances scalability. Components can be scaled independently in response to differing loads, making the overall system more efficient.

3. Event Sourcing and System Understanding:

Understanding System Evolution: Event sourcing is a key aspect of event architecture, where the state changes are captured as a series of events. This approach allows developers to understand not just the current state of the system but also how it arrived there, providing deep insights into system behavior and evolution.

Historical Data Analysis: With every state change recorded as an event, CraftGen has a comprehensive historical dataset that can be analyzed to understand trends, system usage patterns, and more.

4. Recoverability and State Reconstruction:

Event Replayability: Events in CraftGen are recoverable, meaning they can be replayed to alter the system state or fix issues. This capability is crucial for system recovery and error correction.

State Reconstruction: The ability to replay events also allows for the reconstruction of past states. This is particularly useful in scenarios where understanding the historical state of the system is necessary for analysis or debugging.
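The mechanics of event sourcing and replay described above can be summarized in a small sketch: an append-only log, a fold that derives state from events, and replay of a log prefix to reconstruct a past state. The event names here are hypothetical, not CraftGen's actual event types:

```typescript
// Sketch of event sourcing: state is never stored directly; it is
// derived by replaying an immutable, append-only log of events.
type DomainEvent =
  | { kind: "NodeAdded"; id: string }
  | { kind: "NodeRemoved"; id: string };

const log: DomainEvent[] = [];

function append(e: DomainEvent): void {
  log.push(e); // events are only ever appended, never updated or deleted
}

// Rebuild state by folding over the log; passing a shorter slice
// reconstructs the system as it existed at any earlier point in time.
function replay(events: DomainEvent[]): Set<string> {
  const nodes = new Set<string>();
  for (const e of events) {
    if (e.kind === "NodeAdded") nodes.add(e.id);
    else nodes.delete(e.id);
  }
  return nodes;
}

append({ kind: "NodeAdded", id: "a" });
append({ kind: "NodeAdded", id: "b" });
append({ kind: "NodeRemoved", id: "a" });

const current = replay(log);                // state now
const historical = replay(log.slice(0, 2)); // state after the first two events
```

The same fold that produces the current state also answers "what did the system look like then?", which is what makes the log so useful for debugging and analysis.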

5. Insights from Event Data:

User Behavior Analysis: The event data in CraftGen can be mined for insights into user activity and preferences. Analyzing this data helps in enhancing user experience, tailoring features to user needs, and making informed decisions about future enhancements.

Predictive Analytics: The accumulation of event data over time opens avenues for predictive analytics, where future trends and user behaviors can be anticipated, allowing for proactive system enhancements.

In summary, the event architecture in CraftGen not only strengthens the system's robustness and reliability but also provides invaluable tools for debugging, scalability, and data-driven decision-making. This architecture is integral to delivering a responsive, user-centric platform that continually evolves based on user interactions and system performance.

Universal Format and Compatibility: "Build More. Break Less. Empower Others.”

  • Widespread Industry Adoption: The choice of JSON schema aligns with industry standards, as it's a format utilized by major players like Zapier, GitHub, Microsoft, Replicate, and OpenAI. This universal adoption ensures seamless integration and compatibility, facilitating easier collaborations and extensions.
  • Encouraging Innovation: By using a format familiar to many developers and organizations, CraftGen empowers users to build more efficiently and with less risk of compatibility issues. This approach not only streamlines development but also encourages innovation within the platform.

Flexibility and Dynamism:

  • Adaptable Architecture: JSON's inherent flexibility allows CraftGen to dynamically adapt its structure. This means that as new features or requirements emerge, the platform can evolve without major overhauls, ensuring that it remains cutting-edge and user-centric.

Structured Data Validation:

  • Ensuring Data Integrity: JSON Schema is pivotal in maintaining consistency, validity, and interoperability of JSON data at scale. By adopting JSON Schema, CraftGen ensures that the data across its platform is structured and validated, maintaining high standards of data integrity and reliability.

Facilitates API Documentation and Testing:

  • Streamlining Development Processes: JSON Schema aids in clearly documenting API interfaces, making them easier to understand and test. This clarity is invaluable for developers, especially Super Users in CraftGen, who may be working on custom integrations or developing new functionalities.

Dynamic and Extensible Node Representation:

  • Dynamic Nodes as JSON Schema: In CraftGen, each node is represented as a JSON schema, yet retains the flexibility to be extended. This approach means that while the platform has a consistent and understandable base structure, it also offers the freedom for users to tailor and expand it as per their specific needs and creative ideas.
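To make this concrete, here is a hypothetical example of a node described as a JSON Schema, with extensibility expressed through `additionalProperties`. The node and property names are illustrative inventions, not CraftGen's actual schema:

```typescript
// A hypothetical node described as a JSON Schema (draft-07 style).
// The names ("SummarizeText", "prompt", "temperature") are illustrative.
const summarizeNodeSchema = {
  $schema: "http://json-schema.org/draft-07/schema#",
  title: "SummarizeText",
  type: "object",
  properties: {
    prompt: { type: "string", description: "Text to summarize" },
    temperature: { type: "number", minimum: 0, maximum: 2, default: 0.7 },
  },
  required: ["prompt"],
  additionalProperties: true, // nodes stay extensible: extra fields allowed
} as const;

// A trivial structural check against the schema's "required" list; a real
// system would use a full JSON Schema validator instead.
function hasRequired(input: Record<string, unknown>): boolean {
  return summarizeNodeSchema.required.every((k) => k in input);
}
```

Because the schema is plain JSON, the same node description can be validated, documented, and shared with any tool that speaks JSON Schema.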

By leveraging a JSON Schema-based approach, CraftGen positions itself as a flexible, interoperable, and developer-friendly platform. It aligns with industry best practices while providing the necessary tools and structures to ensure data integrity, ease of integration, and continuous innovation.

Generative AI for Extending Abilities of the Platform

At the heart of CraftGen's innovative approach lies its capability to dynamically generate new nodes, tools, and workflows, thereby revolutionizing problem-solving within the AI domain. This dynamic generation is largely powered by a JavaScript-based code interpreter, which serves as a cornerstone for on-the-fly tool creation.

JavaScript-Based Code Interpreter for Tool Creation

  • On-the-Fly Tool Generation: The JavaScript code interpreter in CraftGen allows for the immediate translation of AI-generated ideas into executable code. This process involves interpreting user inputs or AI suggestions and converting them into functional tools that can be immediately deployed within the platform.
  • Real-Time Problem Solving: This capability enables CraftGen to address user queries or problems in real time, rapidly developing custom solutions. For instance, if a user needs to analyze a specific set of data, CraftGen can generate a custom tool to extract, process, and visualize this data, all in a matter of moments.
  • Enhancing Flexibility and Customization: The use of a JavaScript interpreter enhances the platform's flexibility, allowing for a wide range of tools to be created that cater to diverse needs, from data analysis to workflow automation.
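The on-the-fly tool creation described above can be sketched as compiling a generated source string into a callable function. This is a simplified illustration, not CraftGen's interpreter: a production system would sandbox such code, and `new Function` is used here purely to show the shape of the idea:

```typescript
// Sketch of on-the-fly tool creation: an AI-suggested snippet of
// JavaScript arrives as a string and is turned into a callable tool.
// (A real interpreter would sandbox this code before running it.)
type Tool = (input: unknown) => unknown;

function createTool(source: string): Tool {
  // Compile the generated source into a function of one argument.
  return new Function("input", source) as Tool;
}

// Imagine the model generated this body for "sum a list of numbers":
const generated = "return input.reduce((a, b) => a + b, 0);";
const sumTool = createTool(generated);
```

Once compiled, the tool is an ordinary function and can be registered as a node in a workflow like any hand-written one.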

Utilizing Graph Architecture for Efficiency and Structure

  • Single-Step Tool Creation: Leveraging its graph architecture, CraftGen ensures that the tool creation step is executed once for each unique problem. Once a tool or workflow is created for a particular type of task, it can be reused or modified for similar tasks in the future, significantly boosting efficiency.
  • Structured Output and Speed: The graph architecture also contributes to a more structured output. Each node in the graph represents a specific operation or tool, making the workflow easy to understand and follow. This structure, combined with the speed of dynamic tool creation, results in a highly efficient system that quickly delivers clear, actionable results.
  • Scalability and Adaptability: The graph-based approach allows for easy scalability and adaptability of workflows. As new tools or nodes are created, they can be seamlessly integrated into existing workflows, enhancing the platform's overall capabilities without disrupting established processes.

CraftGen, by dynamically generating tools and workflows through a JavaScript-based code interpreter, and by utilizing its graph architecture for structured and efficient operations, stands as a testament to the power of modern AI platforms. This technical sophistication not only accelerates problem-solving but also opens new avenues for creativity, customization, and innovation in AI-driven solutions.
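A minimal sketch of such a graph-structured workflow follows: nodes are operations, edges are dependencies, and each node executes exactly once, in dependency order. All names are illustrative assumptions, not CraftGen's actual types:

```typescript
// Sketch of a graph-structured workflow: each node is one operation,
// edges express dependencies, and execution follows dependency order
// so every node runs exactly once.
type WorkflowNode = { id: string; op: (inputs: number[]) => number };
type Edge = { from: string; to: string };

function run(nodes: WorkflowNode[], edges: Edge[]): Map<string, number> {
  const results = new Map<string, number>();
  const remaining = [...nodes];
  while (remaining.length > 0) {
    // Pick a node whose dependencies have all been computed.
    const i = remaining.findIndex((n) =>
      edges.filter((e) => e.to === n.id).every((e) => results.has(e.from))
    );
    const node = remaining.splice(i, 1)[0];
    const inputs = edges
      .filter((e) => e.to === node.id)
      .map((e) => results.get(e.from)!);
    results.set(node.id, node.op(inputs));
  }
  return results;
}

const out = run(
  [
    { id: "load", op: () => 10 },
    { id: "double", op: (xs) => xs[0] * 2 },
    { id: "addFive", op: (xs) => xs[0] + 5 },
  ],
  [
    { from: "load", to: "double" },
    { from: "double", to: "addFive" },
  ]
);
```

Adding a new capability is then just adding a node and wiring an edge, which is why the graph scales without disturbing existing workflows.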

Embracing Browser Web Technologies Over Server-Dependent Models

In this section, we'll contrast CraftGen's use of browser web technologies with competitors who rely on server-based environments, emphasizing the benefits of our approach, including offline functionality and future plans for a desktop app.

  1. Independence from Server Environments:

Ease of Access vs. Docker Setups: Unlike competitors who require users to set up Python environments and potentially manage Docker instances, CraftGen's browser-based technology allows immediate access without complex setup processes. This distinction is crucial for user convenience, as it removes the need for server spin-ups and intricate configurations.

Lower Barrier to Entry: CraftGen's approach significantly lowers the technical barrier to entry, making the platform accessible to a broader audience, unlike Python/server-based solutions that often require more technical know-how.

  2. Enhanced Offline Functionality for Secure Environments: CraftGen's offline capabilities significantly bolster data privacy and security, ideal for users in sensitive or isolated environments. It supports local Large Language Models (LLMs) to ensure powerful AI processing without internet connectivity, crucial for secure, air-gapped, or restricted-access settings. This feature guarantees that data remains within a controlled environment, addressing privacy concerns and enabling uninterrupted productivity in any setting.

  3. Planned Desktop Application:

Enhanced Interaction with Local Tools: While CraftGen thrives in the browser, a planned desktop application will allow deeper integration with the user's file system and local tools. This app will enable users to control their browser for tasks like automated research, offering a level of interaction that purely server-based solutions can't provide.

Seamless Transition Between Environments: Users will have the flexibility to operate in both a desktop and browser environment, ensuring seamless workflow transitions whether they are online or offline.
