Shade: AI-powered VFX tool to increase productivity

 

AI-based tool Shade for VFX

Shade is an AI-based tool for the AVGC (Animation, VFX, Gaming, Comics) industry. It uses a variety of AI techniques, including machine learning and natural language processing, to automate tasks, freeing artists to focus on the creative aspects of their work. Overall, it is a productivity tool.

As of today, Shade integrates with the following 3D animation and visual effects software:

  • Maya
  • 3ds Max
  • Blender
  • Houdini
  • Unreal Engine
  • Unity
  • After Effects
  • Nuke
  • FCP
  • Premiere Pro
  • Lightroom
  • Photoshop
  • Illustrator
  • Affinity Designer

The Virtual Assist conducted an exclusive interview with Brandon Fan, CEO of Shade. In it, we discuss how Shade works, its pipeline, its benefits and limitations, and how it can be used to improve the production process. We also talk about the future of Shade and how it could revolutionize the AVGC industry.

Excerpts from the interview follow.

1. What are the key features of Shade that make it different from other AI-powered file explorers?

Shade is designed from the ground up to literally feel like your Finder and file explorer got 10xed. Our understanding of almost any type of file definitely differentiates us. We’ve gone to great lengths to make sure we support the various 3D assets across different software like Blender, Houdini, Maya, and Unreal, along with a variety of image and video containers and codecs too.

I think what also separates us is that we use a multi-modal approach. Specifically, we’ve designed our AI to be really good not at “everything” but at a variety of specific asset types. For example, we’ve built out an entire audio neural engine from the ground up that can search samples and SFX while keeping our “visual” neural engine separate.

We’re constantly adding new modalities too (so expect something specific for 3D and for graphics soon).

2. How can the AI-based tool Shade be used in an existing pipeline?

Shade fits directly into every VFX pipeline, as we integrate directly with ShotGrid (all metadata and information is automatically transferred to Shade and made indexable) and we’re actively working on shipping an ftrack integration too. Aside from our application interface (which works on Linux, Windows, and Mac), we’ve launched a headless API that VFX studios and enterprise customers can use to hook directly into our AI search and metadata organizer (adding their own metadata to the search index as well).

Simply send over a preview or turntable of an asset, and we’ll be able to index it.
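For illustration only, a studio-side script that pushes a preview or turntable render plus pipeline metadata into such a headless indexing API might look like the sketch below. The endpoint URL, field names, and authentication here are assumptions made for the example, not Shade's documented API.

```python
# Hypothetical sketch of submitting an asset preview to a headless indexing API.
# The endpoint, fields, and auth scheme are illustrative assumptions only.
import json
import requests

API_URL = "https://shade.example.com/api/v1/index"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                            # placeholder credential

def index_asset(preview_path: str, metadata: dict) -> dict:
    """Upload a preview/turntable render along with studio metadata for indexing."""
    with open(preview_path, "rb") as preview:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"preview": preview},
            # Extra pipeline metadata (e.g. ShotGrid fields) to merge into the search index.
            data={"metadata": json.dumps(metadata)},
        )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(index_asset("dragon_turntable.mp4", {"project": "demo", "artist": "jdoe"}))
```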

3. How does Shade use AI to improve the user experience?

We’ve all been there as individuals, trying to organize our compositing libraries or terabytes of footage. If you talk to any VFX or production studio, you quickly see the rising number of curated assets per project: items you want to reuse, and assets you simply want to organize and search.

Shade makes that easy because we integrate with your native file explorer and operating system, your network drives (and soon cloud storage), and your NAS systems, so you and your team can stay organized, search for the files and assets you need, and quickly drag them into your pipeline or NLE.

4. What are the benefits of Shade?

It is an AI-powered file explorer for creative professionals. Faster access to the assets you care about. Better workflows and methods of organization. Trust that you will literally never lose an asset again. Since Shade knows where all your assets are across all of your drives and locations, and has the metadata, you’ll most definitely be able to find them.

Plus, our file search is actually better than what the native Finder or File Explorer can offer.

5. What are the limitations of Shade?

Like any AI, ours isn’t perfect. It does have a tendency to misclassify or produce insufficiently accurate descriptions for various exotic assets. But I think the benefit of being able to search for what you need and get the right results 95% of the time far outweighs having no search engine at all. Plus, we also make it incredibly easy for users to change the metadata, tags, and descriptions as needed.

In the future, for enterprise customers, we will offer solutions to custom-train models and deploy them on their own infrastructure, with their IP in mind.

6. What are the challenges of developing an AI powered file explorer?

The biggest issue is getting our AI to understand all the unique files and assets that people create. The second is shipping it in a way that makes sure privacy and security are paramount. We’ve designed our application so that all of our models can ship to the user’s company and run locally on their devices. That is very much contrary to the majority of AI startups. Why? Because we value user privacy.

We’ve paid special attention to reducing our model sizes so they can run on consumer-grade hardware.
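As an illustration of the general idea only (not Shade's proprietary optimization pipeline), post-training dynamic quantization is one common way to shrink a model so it runs comfortably on consumer-grade CPUs:

```python
# Generic PyTorch sketch: shrink a model with post-training dynamic quantization.
# This is a stand-in technique, not Shade's actual model-size optimization.
import torch
import torch.nn as nn

# Stand-in for a small metadata/tagging model.
model = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 256),
)

# Convert Linear layers to int8 weights: roughly 4x smaller and faster on CPU.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

with torch.no_grad():
    features = torch.randn(1, 512)
    print(quantized(features).shape)  # torch.Size([1, 256])
```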

7. What machine learning algorithms does Shade use to power its features?

We separate our AI into two core engines. One is Shade’s Visual Neural Engine, which is able to handle and understand anything visual (images, video, 3D assets); the other is Shade’s Audio Neural Engine (SFX, samples). We’ve built out proprietary models and pipelines that extract meaningful metadata and features, along with a set of tags that most accurately encompasses each unique type of asset, optimized for searchability.
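As a rough, hypothetical sketch of that architecture (Shade's real models and feature extractors are proprietary), the idea of keeping separate per-modality engines that feed a single searchable index can be expressed like this, with placeholder embedding functions standing in for the neural engines:

```python
# Minimal sketch: separate per-modality "engines" feeding one search index.
# The embedding functions are random placeholders, not real models.
import numpy as np

def visual_embed(path: str) -> np.ndarray:
    """Stand-in for a visual neural engine: embed an image, video, or 3D preview."""
    rng = np.random.default_rng(abs(hash(("visual", path))) % (2**32))
    return rng.standard_normal(512)

def audio_embed(path: str) -> np.ndarray:
    """Stand-in for an audio neural engine: embed an SFX clip or sample."""
    rng = np.random.default_rng(abs(hash(("audio", path))) % (2**32))
    return rng.standard_normal(512)

class AssetIndex:
    """Single search index fed by separate per-modality engines."""

    def __init__(self):
        self.entries = []  # (path, modality, unit-normalized embedding)

    def add(self, path: str, modality: str):
        embed = visual_embed if modality == "visual" else audio_embed
        vec = embed(path)
        self.entries.append((path, modality, vec / np.linalg.norm(vec)))

    def search(self, query_vec: np.ndarray, modality: str, top_k: int = 5):
        """Cosine-similarity search restricted to one modality."""
        q = query_vec / np.linalg.norm(query_vec)
        scored = [(float(q @ vec), path) for path, m, vec in self.entries if m == modality]
        return sorted(scored, reverse=True)[:top_k]

index = AssetIndex()
index.add("/assets/dragon_turntable.mp4", "visual")
index.add("/sfx/door_slam.wav", "audio")
print(index.search(visual_embed("/assets/dragon_turntable.mp4"), "visual"))
```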

8. At the time of this publication, how many software packages are supported by Shade?

You can take a look at all Shade’s integrations. It’s safe to say it’s a lot, and it’s continuously growing.

9. What are the plans for future development of Shade?

Our vision is to automate the mundane processes of creativity, never to replace it. We believe in a future where, if we can build the copilot for creators and automate frustrating but necessary processes, creators can maximize their time efficiency and get to the vision they want faster than ever. We’re actively leveraging modern LLM technology to build out new ways of interacting with our app.

We’re also looking to partner with various stock libraries to apply Shade’s neural search so people can find the assets they need faster. Finally, we’re looking at a backup service that retains all searchability and metadata so your assets will be stored permanently (and remain findable).

10. How does Shade handle large file sizes?

Shade does this in multiple ways. We’ve optimized the time it takes to index a single file immensely, threaded out a bunch of time-intensive operations, and reduced expensive operations to next to none. Of course, larger files will take longer to process, but in the end, Shade is processing frames of data. Ultimately, however, there are limitations.

If you’re trying to get Shade to index a long Cycles Blender scene, it’s going to struggle, as it needs a visual representation. If you have that visual representation on hand, then it’s very fast! To benchmark, I gave it over 35GB of footage, each file ranging from about 500MB to 1GB in 4K Log, and it was able to index it all in about 15-25 minutes.
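The sketch below illustrates the general approach of sampling frames and threading out the per-frame work rather than decoding every frame of a large clip. The sampling interval and the stand-in feature extractor are illustrative choices, not Shade's internal pipeline.

```python
# Rough sketch: index large footage by sampling frames and threading per-frame work.
# The per-frame "feature" is a cheap stand-in for a real extractor.
import concurrent.futures
import cv2  # pip install opencv-python

def sample_frames(video_path: str, every_n_seconds: float = 5.0):
    """Yield one frame every few seconds instead of decoding the whole clip."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 24.0
    step = max(1, int(fps * every_n_seconds))
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            yield index, frame
        index += 1
    cap.release()

def describe_frame(args):
    """Placeholder per-frame feature extraction (cheap and thread-friendly)."""
    index, frame = args
    return index, frame.mean(axis=(0, 1)).tolist()  # average color as a stand-in feature

def index_footage(video_path: str):
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(describe_frame, sample_frames(video_path)))
```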

11. How does Shade integrate with other AI-powered applications?

Shade currently has a Stability AI integration that lets users generate concept art directly from their assets. Shade has also built out a rotoscoping tool with proprietary spline-to-alpha-matte and alpha-matte-to-spline technology that we hope to combine with other systems like video stabilization, upscaling, denoising, and more, all as one subscription alongside asset management.

12. How is the support system for this AI-based tool?


We launch about every week, and we’re hoping to launch publicly within the month. We’re backed by Contrary Capital and Rough Draft Ventures, but we’re always looking to expand the team! We’re looking for full-stack engineers and backend engineers with experience in video and 3D graphics.

13. What are the security measures in place to protect user privacy?

Everything is done on the client’s computers and servers. This was a requirement from the very beginning when we designed the architecture. We talked with almost every VFX studio, and each had the requirement that it work in a shut-down environment. Similarly, talking with creators and individuals, the value of privacy was paramount. No one wanted to have to upload all of their terabytes of footage and materials to the cloud just to index and search it.

Nor did they want to give up their IP. That’s why everything happens locally on the user’s computer. No data is ever sent to Shade unless explicitly stated and granted by the user (for example, for video and audio transcription the model is too big to run on current client computers, so the audio is sent to our cloud servers for indexing).
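A toy sketch of that local-first policy, with purely illustrative function names, might look like this: processing defaults to an on-device model, and only features the user has explicitly opted into ever touch the cloud.

```python
# Toy sketch of a local-first policy with explicit opt-in for cloud features.
# Function names and behavior are illustrative, not Shade's implementation.
def run_local_model(path: str, feature: str) -> str:
    return f"indexed {path} locally for {feature}"

def send_to_cloud(path: str, feature: str) -> str:
    return f"uploaded {path} for cloud {feature} (user opted in)"

def process_asset(path: str, feature: str, cloud_opt_ins) -> str:
    # Default is on-device processing; the cloud is used only for features
    # the user has explicitly granted (e.g. audio transcription).
    if feature in cloud_opt_ins:
        return send_to_cloud(path, feature)
    return run_local_model(path, feature)

print(process_asset("interview.wav", "transcription", cloud_opt_ins={"transcription"}))
print(process_asset("plate_001.exr", "visual-index", cloud_opt_ins=set()))
```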

14. Let us know about Shade’s roto tool.

You can think of it as a masking engine powered by an AI tracking system. It seamlessly converts between splines and masks, and vice versa, without losing data.

Currently, it is integrated with After Effects and Nuke. You can use it offline.
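For readers unfamiliar with the concept, the sketch below shows a basic spline-to-matte and matte-to-spline round trip using plain OpenCV. Shade's lossless conversion technology is proprietary; the contour tracing here is an approximation rather than an exact inverse.

```python
# Minimal sketch of the spline <-> matte idea using plain OpenCV approximations.
# Not Shade's proprietary lossless conversion.
import numpy as np
import cv2

def spline_to_matte(points, width: int, height: int) -> np.ndarray:
    """Fill a closed polygon/spline (list of (x, y) points) into an 8-bit alpha matte."""
    matte = np.zeros((height, width), dtype=np.uint8)
    cv2.fillPoly(matte, [np.array(points, dtype=np.int32)], 255)
    return matte

def matte_to_spline(matte: np.ndarray, epsilon: float = 1.0):
    """Trace the matte's outer contour back into a simplified point list (lossy here)."""
    contours, _ = cv2.findContours(matte, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []
    largest = max(contours, key=cv2.contourArea)
    approx = cv2.approxPolyDP(largest, epsilon, True)
    return approx.reshape(-1, 2).tolist()

matte = spline_to_matte([(100, 100), (400, 120), (380, 400), (90, 380)], 512, 512)
print(len(matte_to_spline(matte)), "points recovered")
```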

15. Why the name ‘Shade’?

We wanted something that was “edgy” and evoked something creative. Shade, just like different shades of color, sort of made sense. Ultimately, we want to be the cool kid on the block, but more importantly, we want to be one of the most useful tools on your dock, and we hope that Shade becomes a staple in the VFX and creator space.