Software, data and business teams build solutions together, seamlessly combining their technical and domain expertise
Trusted by financial and telecom companies to handle heavy data processing and decision-making
Business Team
Apply Your Domain Knowledge with Zero Delay
| Express decision algorithms with self-explanatory flow diagrams |
| Use spreadsheet-like formulas for even the most complicated data transformations and Boolean conditions |
| Verify your ideas instantly using one-click deployment and testing functionalities |
Data Team
Productionize AI for Demanding Workloads
| Focus on ML insights, not ML model deployment |
| Use RAG, add model pre- and post-processing, and ensemble multiple models |
| Prompt foundation models, run inference on registry-stored Python pickles, or use standardized weight formats |
Software Team
Engage Stakeholders in Data Processing Without Compromising on Technology
| Set up integrations and let domain experts make use of the data |
| Use the power of Flink while keeping it under the hood |
| Adjust to your specific needs: add UDFs and specialized components |
Key Features
Ease of Use
Focus on data transformations and flow logic only
Apache Flink and Kafka complexity hidden from developers
Algorithms composed from highly flexible, prefabricated building blocks
Algorithms visualized as interactive flow diagrams
Powerful, yet easy expression language for data transformations and flow control
Schema-aware autocompletion and validation, as in developer IDEs
Testing and debugging with visualization of data and algorithm behavior
Real-time monitoring and metrics
AI-powered Assistant
One-click deployment
Deployment Flexibility
Run in Nu Cloud or on-premises
Kubernetes or OS-based installations
Nu Cloud Managed Flink, Ververica Platform, or Bring Your Own Flink
Fit for Your Use Case
Unifies streaming & batch processing
Kafka® source and sink interfaces, integrates with Confluent® Cloud, Azure Event Hubs® and Aiven®
Database & data lake sources and sinks via Flink connectors
REST (OpenAPI) enrichments
Database (JDBC) enrichments
Generalized decision table for complex IF-ELSE logic
Enrichments via ML model inference
Agentic AI: LLM execution, MCP server integration, embeddings management via vector stores integration
Rock-solid fundamentals of Apache Flink
Scales horizontally to millions of events per second
Built for low latency
Stateful processing: time windows, state management
Exactly-once processing
Efficient Avro binary serialization
Built-in checkpointing for fault tolerance
Nussknacker is a graphical tool to define, deploy and monitor Apache Flink jobs. Job logic is expressed as a graph, with SpEL used for data transformations and Boolean conditions.
Nussknacker supports various data sources - Kafka streams, files, databases, HTTP APIs, and many others, either natively or via Flink connectors.
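As a rough illustration of what such SpEL expressions look like (the event fields `amount`, `country`, and `customerId` are made up for this sketch, not taken from a real schema), a filter condition and a record-building transformation might be written as:

```
#input.amount > 1000 && #input.country == 'PL'

{customerId: #input.customerId, risk: #input.amount > 5000 ? 'HIGH' : 'LOW'}
```

The first expression could gate a filter node; the second uses SpEL's inline-map syntax and a ternary to build a new record for a downstream node.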
Use Cases
Real-time Marketing
Communicating with customers in real time, delivering event-driven offers and actions
RTM Automation
Fraud Management
Mitigating fraud by running detection algorithms on network or device signals
Fraud Monitoring
Next Best Action
Assisting the point of sale, displaying suggestions about what to offer and how to proceed with a customer
Recommendation System
Customer Data Processing
Decisioning on dynamic customer data in
- dynamic pricing
- order status management
- instant credit scoring
Internet of Things
Automating actions on data in
- predictive maintenance
- inventory management
- smart devices
Gaming Engagement
ML Models Deployment
Run Machine Learning model inference from within complex decision algorithms
ML Inference
Offer
Freemium
Hosted by Nussknacker
A quick solution for straightforward yet demanding data streaming tasks, without a major upfront investment
Pro
Hosted by Nussknacker
A ready-to-use collection of features and integrations for advanced data environments, with affordable infrastructure maintenance costs
Enterprise
Self-hosted / On Premise / BYOC / by Nussknacker
An extensible tool tailored to sophisticated technology stacks where strict data integrity is required
Blog
Building Iceberg Lakehouse with AWS Glue
Leverage Nussknacker to build a modern data lakehouse on S3 using Apache Iceberg and AWS Glue.
Understanding Event Time in Nussknacker
Learn why Event time is crucial in stream processing and how Nussknacker leverages it for reliable stateful computations.
Generation-Augmented Retrieval (GAR) for Product Recommendation
How to quickly launch a solution based on LLM/Foundation Models to engage customers.
Next Steps
see the demo in action
try it yourself
feel free to contact us