Software used to process data continuously as events occur.
- Manage, store, and analyze streaming data
- Visualize data flow and validate data and its delivery
- Extend the reach of existing enterprise assets using connectors to multiple core systems
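As an illustration of the category itself (not of any specific product listed below), here is a minimal Python sketch of continuous event processing, assuming events arrive one at a time from some source: each event is parsed and folded into running totals the moment it occurs, rather than in a batch afterward. The event format and field names are hypothetical.

```python
from collections import Counter
from typing import Iterable, Iterator


def parse(raw_events: Iterable[str]) -> Iterator[dict]:
    """Turn raw 'type:value' strings into event dicts as they arrive."""
    for raw in raw_events:
        kind, _, value = raw.partition(":")
        yield {"type": kind, "value": int(value)}


def running_totals(events: Iterable[dict]) -> Iterator[Counter]:
    """Emit an updated per-type total after every event (continuous processing)."""
    totals: Counter = Counter()
    for event in events:
        totals[event["type"]] += event["value"]
        yield totals.copy()


# Hypothetical event stream; a real system would read from a socket or broker.
stream = iter(["click:1", "view:3", "click:2"])
snapshots = list(running_totals(parse(stream)))
print(dict(snapshots[-1]))  # {'click': 3, 'view': 3}
```

Because both stages are generators, nothing is buffered: each event flows through the whole pipeline before the next one is read, which is the defining property the products in this category industrialize.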
26 results
Cribl Edge™ Your not-so-secret agent for vendor-neutral, unified collection. Cribl Edge is a vendor-neutral, intelligent agent designed for the variety and scale of today’s modern architectures. With a unified telemetry collection system, you can have hundreds of thousands of agents at your fingertips to automatically discover and collect data from your Windows, Linux, and Kubernetes environments. Featuring a rich UI, centralized fleet management, and seamless upgrades, it’s time to transform your agent management.
Cribl Stream™ Get the data you need, in the format you want, wherever it needs to go. Cribl Stream is your go-to solution for managing the variety and volume of telemetry data without blowing up your budget. Stream is the industry's leading observability pipeline, empowering you to collect, reduce, enrich, and route telemetry data in real time from any source to any tool in the right format. Whether you're dealing with megabytes or petabytes, Cribl Stream scales with ease, giving you the flexibility and control to handle any data, your way.
Real-time data integration built for the cloud. The Striim platform makes it easy to ingest, process, and deliver real-time data across diverse environments, in the cloud or on-premises. Striim can ingest real-time data from a variety of sources, including change data from enterprise databases, and rapidly deliver it to on-premises or cloud systems. While the data is moving, it is easy to process it with simple SQL-based transformations to get it into the correct form for the target. Striim provides enterprise-grade real-time integration in a scalable, reliable, and secure platform, with real-time monitoring and alerts that give visibility into your continuous data pipelines. Businesses use Striim to migrate from on-premises databases to cloud environments without downtime, keep cloud-based analytics environments up to date, and feed real-time data to their on-premises and cloud data lakes and messaging systems for timely operational decision-making.
Hazelcast Platform is a unified real-time data platform that enables companies to act instantly on streaming data. It combines high-performance stream processing capabilities with an enterprise-grade cache to let you easily enrich streaming data with historical context to continuously uncover hidden actionable insights. This unified architecture reduces the number of moving parts in your infrastructure and removes much of the complexity of developing and deploying sophisticated business-critical applications.
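The enrich-with-historical-context pattern Hazelcast describes can be sketched in plain Python: a streaming event is joined against an in-memory cache keyed by entity id. All names here are hypothetical; in Hazelcast Platform the cache would be a distributed map populated from a system of record, not a local dict.

```python
# Hypothetical in-memory cache of historical context, keyed by user id.
customer_cache = {
    "u1": {"tier": "gold", "region": "EU"},
    "u2": {"tier": "basic", "region": "US"},
}


def enrich(event: dict, cache: dict) -> dict:
    """Join a streaming event with cached historical context for that key."""
    context = cache.get(event["user"], {})
    return {**event, **context}


stream = [{"user": "u1", "amount": 40}, {"user": "u2", "amount": 5}]
enriched = [enrich(e, customer_cache) for e in stream]
print(enriched[0])  # {'user': 'u1', 'amount': 40, 'tier': 'gold', 'region': 'EU'}
```

Keeping the lookup table in memory next to the stream processor is what makes per-event enrichment cheap enough to do continuously; that co-location is the "unified architecture" claim above.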
The StreamSets platform provides modern data integration capabilities that deliver analytics-ready data through resilient and repeatable pipelines. The StreamSets platform eliminates data integration friction, enables innovation with centralized guardrails, and insulates data pipelines from unexpected change, allowing teams to unlock data - without ceding control - to enable a data-driven enterprise.
Managed Service for Apache Kafka® Apache Kafka® is the leading data streaming technology for large-scale, data-intensive applications. Streamline your Apache Kafka cluster deployment in the cloud and get a production-ready, fully managed service in about 10 minutes. Get more than just Kafka with DoubleCloud:
- Enhanced monitoring: detailed metrics on topics and replication, alongside real-time infrastructure stats such as CPU, memory, and network usage.
- MirrorMaker and S3 sink Kafka connectors: easily replicate data between clusters using the MirrorMaker connector.
- x86 and ARM support: get a 20% price/performance boost by running Apache Kafka on ARM-based instances on AWS with no extra work.
- Built-in integrations: use Data Transfer, which comes with over 20 connectors, to move data between various sources and Apache Kafka.
- Schema Registry and REST Proxy support.
DeltaStream is a unified serverless stream processing platform that integrates with streaming storage services, acting as the compute layer on top of your streaming storage. It provides the functionality of streaming analytics (stream processing) and streaming databases, along with additional features, forming a complete platform to manage, process, secure, and share streaming data. DeltaStream provides a SQL-based interface where you can easily create stream processing applications such as streaming pipelines, materialized views, and microservices. It has a pluggable processing engine and currently uses Apache Flink as its primary stream processing engine. DeltaStream brings relational database concepts to the data streaming world, including namespacing and role-based access control, enabling you to securely access, process, and share your streaming data regardless of where it is stored.
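The "materialized view over a stream" idea mentioned above can be sketched in Python: instead of recomputing an aggregate from scratch, the view's state is updated incrementally as each event arrives, so a query against it is always current. The class and field names are hypothetical; DeltaStream itself exposes this through SQL, not Python.

```python
class MaterializedView:
    """Incrementally maintained per-key sum, updated on every event."""

    def __init__(self) -> None:
        self.state: dict[str, int] = {}

    def apply(self, event: dict) -> None:
        # Fold one event into the view instead of rescanning history.
        key = event["key"]
        self.state[key] = self.state.get(key, 0) + event["value"]


view = MaterializedView()
events = [{"key": "a", "value": 2}, {"key": "b", "value": 5}, {"key": "a", "value": 1}]
for event in events:
    view.apply(event)  # the view stays current as the stream flows

print(view.state)  # {'a': 3, 'b': 5}
```

The incremental update is the key design choice: reads against the view cost nothing extra, because all the work was paid per event as the stream flowed.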
PortX provides complete control of high-performance observability data pipelines from multiple sources at any scale, with a single, seamless experience, freedom from vendor lock-in, and controlled routing that sends only the specific data each destination needs. PortX includes AI/ML-assisted log-data collection, automated parsing, pattern detection, aggregation, advanced filtering, masking and encryption, and enrichment. PortX significantly reduces logging costs.