Flux Upgrades Its Copilot Tool, Adding Multi-Modal AI Features

December 21, 2023 by Duane Benson

Released today, the upgrade brings improvements, including image capability, to the Flux AI chatbot, giving designers a smarter virtual “engineering partner” to help with electronics PCB designs.

AI pundits have been saying that AI can help engineers with their design tasks. The two big questions that come along with such a proclamation are "what does that mean?", and "how can I do that?" Flux has a solid answer with the latest release of their web-based electronics design tool, Flux Copilot.

The Copilot capability, originally launched in April 2023, has an integrated AI chatbot based on an engineering-oriented large language model (LLM), with the aim of giving designers a virtual engineering co-pilot. Claiming it as the “first multi-modal AI for hardware design,” Flux has today released a new version that includes several significant AI upgrades, including the ability to recognize and work with images.


Flux design tool with AI Copilot



Flux co-founder Lance Cassidy describes the newest version of the AI Copilot as “an AI assistant for designing hardware.” He sees AI as having the potential to touch every part of the design cycle, from initial idea to system architecture, all the way to the tedium of design documentation. The new release builds on the original tool to make it an even more capable assistant.

AI tends to fall into one of two functional modes: generative, which creates original content, and interpretive, which analyzes data and delivers an interpretation of it. The multimodal AI in the new release of Flux Copilot does both: it can interpret human input, reference bitmap images and language-model databases, and generate original design content based on user prompts.

Flux has been a team-based collaborative system from the start. Collaboration is integral to the design process, and in Flux that collaboration consists of both human partners and the multimodal AI chatbot. Multiple team members can work on a design simultaneously, each using the AI chatbot for design guidance, documentation, reference, and mundane tasks.


Finding the Important Information

One of the big challenges facing an engineer deep in a product design is keeping track of the wealth of design information. Each component has a data sheet with key information, but data sheets don’t follow a standard format: some are just a few pages, while others run to dozens. The newest Flux AI chat features can dig through data sheets and find the right information for you.
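To make the datasheet-retrieval idea concrete, here is a toy sketch in Python. The page contents, part names, and the naive keyword search are all hypothetical stand-ins for the AI retrieval step Flux describes; the point is only the shape of the feature, i.e., returning the relevant passage together with its page number.

```python
# Hypothetical datasheet pages, pre-extracted to text (illustrative only).
datasheet_pages = {
    1: "Absolute maximum ratings: VDD 3.6 V max, storage -65 to 150 C",
    12: "SPI interface timing: SCLK maximum frequency 10 MHz",
    95: "Recommended decoupling: 100 nF ceramic capacitor close to VDD",
}

def find_pages(pages, query):
    """Return (page_number, text) pairs whose text mentions the query."""
    q = query.lower()
    return [(n, t) for n, t in sorted(pages.items()) if q in t.lower()]

# Ask where the datasheet discusses decoupling.
for page, text in find_pages(datasheet_pages, "decoupling"):
    print(f"page {page}: {text}")
```

A real system would use semantic retrieval rather than substring matching, but the citation-with-page-number output mirrors the behavior the article describes later, where Copilot links its answers back to the datasheet page it referenced.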


Flux Copilot design advice



Along with the data sheets, the new Flux understands the bill of materials. It knows the components, the costs, and the project requirements. Such broad understanding can help with the all-too-common tradeoffs between performance and cost. The AI tool can easily handle tasks like ensuring that all components meet the design temperature range, or that a mobile design stays within its overall power budget.
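The kinds of BOM checks mentioned above can be sketched as a plain script. This is not Flux's implementation; the component fields, ratings, and thresholds below are invented for illustration, showing the temperature-range and power-budget checks as explicit rules.

```python
# Required operating range (degrees C) and power budget (illustrative values).
DESIGN_TEMP_RANGE = (-40, 85)
POWER_BUDGET_MW = 500

# Hypothetical bill of materials with the fields the checks need.
bom = [
    {"ref": "U1", "part": "MCU",     "temp_range": (-40, 105), "power_mw": 120},
    {"ref": "U2", "part": "Display", "temp_range": (-20, 70),  "power_mw": 300},
    {"ref": "U3", "part": "Sensor",  "temp_range": (-40, 125), "power_mw": 15},
]

def check_bom(bom, temp_range, budget_mw):
    """Flag parts that can't cover the design temperature range,
    and flag the design if total power exceeds the budget."""
    issues = []
    lo, hi = temp_range
    for item in bom:
        t_lo, t_hi = item["temp_range"]
        if t_lo > lo or t_hi < hi:
            issues.append(f"{item['ref']}: rated {t_lo}..{t_hi} C, "
                          f"design needs {lo}..{hi} C")
    total = sum(item["power_mw"] for item in bom)
    if total > budget_mw:
        issues.append(f"power budget exceeded: {total} mW > {budget_mw} mW")
    return issues

for issue in check_bom(bom, DESIGN_TEMP_RANGE, POWER_BUDGET_MW):
    print(issue)
```

Here the display (U2) would be flagged because its commercial temperature rating does not cover the full industrial design range, which is exactly the sort of tradeoff the article says the AI can surface.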


Image Recognition and Interpretation

New in this version is powerful image recognition and interpretation, dubbed “Copilot Vision.” You can feed the AI chatbot a block diagram, for example. The AI will recognize any text in the image and then break the block diagram down into major sections. Based on further questioning, it will recommend components for each section.


Copilot Vision enables the tool to recognize and pull information from a block diagram like this.



After your design is complete, you can compare the original block diagram to the finished design as part of the design review. Copilot compares section by section and verifies that you have hit your mark. The AI will also make recommendations; for example, it may suggest adding ESD protection to input lines in the project.


Design Advice and Wiring Assistance

With the new version, design collaboration has entered the AI age. This is where the generative part of multimodal AI comes into play. For example, AI Copilot can compare two component data sheets and make some decisions for you.

In an example given by Flux product expert Kerry Chayka, AI Copilot was used to connect a microcontroller to a TFT display module over the SPI bus. Given the command “Can you connect U1 and U2?”, the software found that both chips have an SPI bus and wired up SDI, SDO, SCLK, chip select, power, and ground.
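The pin-matching step behind a request like “Can you connect U1 and U2?” can be sketched as a role-based mapping. The pin names and pairing rules below are hypothetical, not Flux's actual logic: a host's data output feeds the device's data input (and vice versa), while clock, chip select, power, and ground connect role-to-role.

```python
# Hypothetical pin tables: SPI role -> physical pin name (illustrative).
U1_PINS = {"SDO": "PA7", "SDI": "PA6", "SCLK": "PA5", "CS": "PA4",
           "VDD": "VDD", "GND": "GND"}   # microcontroller (SPI host)
U2_PINS = {"SDI": "D_IN", "SDO": "D_OUT", "SCLK": "CLK", "CS": "CS",
           "VDD": "VCC", "GND": "GND"}   # TFT display (SPI device)

# (host role, device role) pairs: outputs feed inputs, the rest match by role.
ROLE_MAP = [("SDO", "SDI"), ("SDI", "SDO"), ("SCLK", "SCLK"),
            ("CS", "CS"), ("VDD", "VDD"), ("GND", "GND")]

def connect(host_pins, device_pins, role_map):
    """Build a net list pairing host and device pins by SPI role."""
    nets = []
    for host_role, device_role in role_map:
        if host_role in host_pins and device_role in device_pins:
            nets.append((f"U1.{host_pins[host_role]}",
                         f"U2.{device_pins[device_role]}"))
    return nets

for a, b in connect(U1_PINS, U2_PINS, ROLE_MAP):
    print(f"{a} -> {b}")
```

In practice the hard part, which the article credits to the AI, is inferring the roles from two differently worded data sheets in the first place; once the roles are known, the wiring itself is mechanical, as the sketch shows.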


Security and Avoiding “AI Hallucinations”

With AI tools becoming accessible to more people, Chayka pointed out, a phenomenon called “AI hallucination” has arisen: sometimes AI tools (and this applies to all generative AI tools) go too far and give information that is not as accurate as we would like. Flux has greatly reduced the problem with careful language-model training.

Copilot also makes it easy to verify any information it gives. “If it says, hey, I got this information from page 95 on the data sheet, it'll link you to page 95 on the data sheet in the section that it referenced.” That built-in verification lets a designer confirm that the AI is behaving, raising overall confidence in the system.

Another common AI concern is confidentiality. No one wants their proprietary design used in a global language model. Flux keeps private data private, so nothing proprietary is exposed to the outside world. Team members have full access, but proprietary content is secured from unauthorized users and does not go into general language-model development.


Where’s This Going? What’s Next?

Summing up, Cassidy explains what’s at the heart of the company’s efforts with Copilot. “Why are we doing this and where is it all going?” he says. Unlike software coding, hardware is not completely text-based. “In circuit design, there's a lot of knowledge that is visual—where photos, diagrams, all these kinds of visual assets are useful. So that's why we're prioritizing that kind of interface.”


“We're trying to enable basically whatever the most natural method is for engineers to interact with the AI. That’s what we want to enable. There's tons of different multimodal means like text, images, video, and other types of documents. If it's PDFs or projects or anything like that, you can now start taking more of the workflow that’s not captured within tools and start leveraging them. That's where all of this is going.”


All images used courtesy of Flux