This scenario applies if you want to install the entire HDF platform, consisting of all flow management and stream processing components, on a new cluster.
The stream processing components include the new Streaming Analytics Manager (SAM) modules that are in GA (General Availability). This includes the SAM Stream Builder and Stream Operations modules but does not include installing the technical preview version of SAM Stream Insight, which is powered by Druid and Superset.
This scenario requires that you install an HDF cluster.
This scenario applies to you if you are both a Hortonworks Data Platform (HDP) and HDF customer and you want to install a fresh HDP cluster and add HDF services to it.
The stream processing components include the new Streaming Analytics Manager (SAM) and all of its modules. This includes installing the technical preview version of the SAM Stream Insight module, which is powered by Druid and Apache Superset.
This scenario requires that you install both an HDF cluster and an HDP cluster.
As described on this page, you only need to run a few queries in Postgres. They are almost all CREATE DATABASE and CREATE TABLE statements, so they do not touch the tables that the existing Ambari installation uses, and it should be safe to run them.
Configure Postgres to Allow Remote Connections It is critical that you configure Postgres to allow remote connections before you deploy a cluster. If you do not perform these steps in advance of installing your cluster, the installation fails.
I have not verified this part, but the Postgres instance that is set up when Ambari is installed is probably already configured this way.
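The exact files and paths vary by Postgres version and OS, so the following is only a sketch of the typical changes (the data directory path shown is an assumption; substitute your own):

```
# /var/lib/pgsql/data/postgresql.conf — accept connections on all interfaces
listen_addresses = '*'

# /var/lib/pgsql/data/pg_hba.conf — allow password auth from remote hosts
# (0.0.0.0/0 is permissive; narrow it to your cluster subnet in practice)
host    all    all    0.0.0.0/0    md5
```

After editing both files, restart Postgres (for example, `systemctl restart postgresql`) so the changes take effect.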
Create a database and user, each called registry, with the password registry, and grant the user privileges on the database:
CREATE DATABASE registry;
CREATE USER registry WITH PASSWORD 'registry';
GRANT ALL PRIVILEGES ON DATABASE "registry" TO registry;
Create a database and user, each called streamline, with the password streamline, and grant the user privileges on the database:
CREATE DATABASE streamline;
CREATE USER streamline WITH PASSWORD 'streamline';
GRANT ALL PRIVILEGES ON DATABASE "streamline" TO streamline;
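To confirm the new accounts work, you can attempt a connection as each user. This is a hypothetical check, not part of the official steps; `<ambari-host>` is a placeholder for the node running Postgres:

```
psql -h <ambari-host> -p 5432 -U registry -d registry -c '\conninfo'
psql -h <ambari-host> -p 5432 -U streamline -d streamline -c '\conninfo'
```

If remote connections are not yet allowed (previous step), these commands fail with a pg_hba.conf rejection rather than a password prompt.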
Configure Druid and Superset Metadata Stores in Postgres Druid and Superset require a relational data store to store metadata. To use Postgres for this, install Postgres and create a database for the Druid metastore. If you have already created a data store using MySQL, you do not need to configure additional metadata stores in Postgres.
Log in to Postgres:
sudo su postgres
Create a database, user, and password, each called druid, and assign database privileges to the user druid:
CREATE DATABASE druid;
CREATE USER druid WITH PASSWORD 'druid';
GRANT ALL PRIVILEGES ON DATABASE "druid" TO druid;
Create a database, user, and password, each called superset, and assign database privileges to the user superset:
CREATE DATABASE superset;
CREATE USER superset WITH PASSWORD 'superset';
GRANT ALL PRIVILEGES ON DATABASE "superset" TO superset;
4. Install HDF Management Pack
This step adds the HDF stack to Ambari.
Download the Hortonworks HDF management pack. You can find the download location for your operating system in the HDF Release Notes.
Copy the bundle to /tmp on the node where you installed Ambari.
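The management pack is then installed with the `ambari-server install-mpack` command. The tarball name below is a placeholder; use the actual filename you downloaded for your HDF version:

```
ambari-server install-mpack \
  --mpack=/tmp/hdf-ambari-mpack-<version>.tar.gz \
  --verbose
ambari-server restart
```

Restarting Ambari Server after installing the management pack makes the HDF stack available when you create or modify a cluster.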