RapidMiner AI Hub -- for teams!

It's a mantra in computer science that you should use "the best tool for the job". This observation is often followed by the comment that the best tool for the job is the one you know well, because with it you can quickly and efficiently put your skills to use.

But what's the best tool for a team? Ideally it should be flexible enough to encompass a variety of tools, while unifying them onto a single platform to boost productivity. Maybe you can recognize yourself in one of the following categories:

  • Do you like Python? RapidMiner AI Hub includes JupyterLab and smoothly integrates with Python workflows.
  • Already using RapidMiner Studio? RapidMiner AI Hub provides a shared platform, running on more powerful hardware, to enhance the team's productivity.

RapidMiner AI Hub is a scalable, secure, easy-to-use platform. It provides a single shared workspace for team collaboration, with numerous productivity-enhancing features, described under Productivity below.

It's one thing to create a lot of models, quite another to deploy them to get useful predictions! The data workflows your team creates will be more useful if their output can be consumed in a convenient fashion. For this purpose, RapidMiner AI Hub provides Panopticon and Web API endpoints.

Continue reading to learn more, or investigate the items in the side menu.

Create a data workflow / model

RapidMiner provides two different tools for creating a data workflow: RapidMiner Studio is a standalone product, while JupyterLab is a service provided by RapidMiner AI Hub.

  • RapidMiner Studio is a visual workflow designer for ETL and model building
  • JupyterLab is a user-friendly interface for data analysis with R, Python, and Anaconda

RapidMiner Studio

RapidMiner Studio is a visual workflow designer that enables more complex data flows, including ETL and model building. RapidMiner AI Hub enhances RapidMiner Studio by providing shared repositories for models and processes, batch jobs, scheduling, and project management.

../studio/getting-started/img/welcome-to-rapidminer-titanic.png

JupyterLab

JupyterLab, a service included in RapidMiner AI Hub, provides a user-friendly interface for data analysis using R or Python (with Anaconda). Note that Python users have full access to Projects and the other productivity tools provided by RapidMiner AI Hub, and can easily integrate their work with the rest of the platform.

img/welcome-to-rapidminer-notebooks.png

Consume the output of a data workflow / model

To monitor and evaluate the output of your data workflows, RapidMiner AI Hub includes Panopticon and Web API endpoints. A programmatic interface is available via a REST API and a Python API.

Panopticon

RapidMiner AI Hub includes Panopticon, which supports a wide range of information visualizations, including Treemaps, Heat Maps, Scatter Plots, Horizon Graphs, and more, all designed for fast comprehension and easy interpretation of static, time series, real-time streaming, and historic data sets.

img/panopticon-in-action.png

Web API endpoints

You can use Web API endpoints to remotely trigger RapidMiner processes, make predictions, and read the output in a browser or any other software that knows HTTP!

endpoints/img/scoring-agent.png
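For example, once a process has been deployed behind a Web API endpoint, any HTTP client can submit data and read back the result. The sketch below uses Python's requests library; the endpoint URL, the token, and the field names are placeholders, not values from an actual deployment.

```python
import requests

# Placeholder values: substitute the URL and credentials of your own endpoint.
ENDPOINT_URL = "https://aihub.example.com/api/endpoints/churn-scoring"
API_TOKEN = "YOUR-TOKEN"

# Illustrative input record; the expected fields depend on the deployed process.
payload = {"data": [{"age": 42, "tenure_months": 18, "monthly_charges": 79.5}]}

response = requests.post(
    ENDPOINT_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# The response body contains the endpoint's output, e.g. the model's predictions.
print(response.json())
```

The same request could just as well be sent from curl, a browser, or any other HTTP client.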

Productivity

RapidMiner AI Hub improves productivity by providing a shared workspace for the re-use of connections, data, workflows, code, models, and results. In this shared workspace, you will find the following features:

Projects

RapidMiner AI Hub provides Projects so that the team will have a step-by-step history of the work they have done. The Projects back end is provided by Git, but no knowledge of Git is required to use this feature.

projects/img/connect/clone-url.png
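Because the back end is Git, a project can also be cloned outside of RapidMiner Studio with standard Git tooling, using the clone URL shown above. Here is a minimal sketch; the clone URL and local path are placeholders.

```python
import subprocess

# Placeholder values: use the clone URL displayed for your project in RapidMiner AI Hub.
CLONE_URL = "https://aihub.example.com/path/to/my-project.git"
LOCAL_PATH = "my-project"

# Clone the project repository with the standard git client.
subprocess.run(["git", "clone", CLONE_URL, LOCAL_PATH], check=True)
```

Day to day, however, RapidMiner Studio and JupyterLab handle these Git operations for you behind the scenes.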

Users and Groups

RapidMiner AI Hub provides role-based access control via users and groups. It can integrate with any SAML v2.0 and OAuth2 capable identity provider for identity federation, and with LDAP servers for user federation.

manage/users-groups/img/user-add.png

Schedules and queues

Long-running jobs can be executed more quickly and efficiently on server hardware. A queue system helps to organize these jobs, so that everyone gets their turn and the server never sleeps. If timing is important, your jobs can be scheduled or triggered by an external event.

run-a-process/schedule-a-process/img/cron-editor-3.png
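The schedule editor shown above uses cron expressions to describe recurring runs. Purely to illustrate how such an expression reads, here is a sketch using the third-party croniter package (not part of RapidMiner AI Hub):

```python
from datetime import datetime
from croniter import croniter  # third-party package, used here only for illustration

# "0 2 * * 1-5" reads: minute 0, hour 2, any day of month, any month, Monday through Friday,
# i.e. 02:00 on every weekday.
schedule = croniter("0 2 * * 1-5", datetime(2024, 1, 1))

# Print the next three times a job with this schedule would run.
for _ in range(3):
    print(schedule.get_next(datetime))
```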