# Pig CROSS Operator

## The CROSS Operator

The CROSS operator computes the cross product (Cartesian product) of two or more relations, pairing every tuple of one relation with every tuple of the other. This chapter shows by example how to use the CROSS operator in Pig Latin.

**Syntax**

The syntax of the CROSS operator is given below.

```bash
grunt> Relation3_name = CROSS Relation1_name, Relation2_name;
```

Assume we have two files in the /pig_data/ directory of HDFS, customers.txt and orders.txt, with the following contents.

customers.txt

```
1,Ramesh,32,Ahmedabad,2020.00
2,Khilan,25,Delhi,1500.00
3,kaushik,23,Kota,2020.00
4,Chaitali,25,Mumbai,6500.00
5,Hardik,27,Bhopal,8500.00
6,Komal,22,MP,4500.00
7,Muffy,24,Indore,10000.00
```

orders.txt

```
102,2019-10-08 00:00:00,3,3000
100,2019-10-08 00:00:00,3,1500
101,2019-11-20 00:00:00,2,1560
103,2018-05-20 00:00:00,4,2060
```

We have loaded these two files into Pig as the relations customers and orders, as shown below.

```bash
grunt> customers = LOAD 'hdfs://localhost:9000/pig_data/customers.txt' USING PigStorage(',')
   as (id:int, name:chararray, age:int, address:chararray, salary:int);

grunt> orders = LOAD 'hdfs://localhost:9000/pig_data/orders.txt' USING PigStorage(',')
   as (oid:int, date:chararray, customer_id:int, amount:int);
```

Now let us apply the CROSS operator to these two relations to obtain their cross product, as shown below.

```bash
grunt> cross_data = CROSS customers, orders;
```

**Verification**

Verify the relation cross_data using the DUMP operator, as shown below.

```bash
grunt> Dump cross_data;
```

**Output**

It will produce the following output, displaying the contents of the relation cross_data. Since customers has 7 tuples and orders has 4, the cross product contains 7 × 4 = 28 tuples.

```
(7,Muffy,24,Indore,10000,103,2018-05-20 00:00:00,4,2060)
(7,Muffy,24,Indore,10000,101,2019-11-20 00:00:00,2,1560)
(7,Muffy,24,Indore,10000,100,2019-10-08 00:00:00,3,1500)
(7,Muffy,24,Indore,10000,102,2019-10-08 00:00:00,3,3000)
(6,Komal,22,MP,4500,103,2018-05-20 00:00:00,4,2060)
(6,Komal,22,MP,4500,101,2019-11-20 00:00:00,2,1560)
(6,Komal,22,MP,4500,100,2019-10-08 00:00:00,3,1500)
(6,Komal,22,MP,4500,102,2019-10-08 00:00:00,3,3000)
(5,Hardik,27,Bhopal,8500,103,2018-05-20 00:00:00,4,2060)
(5,Hardik,27,Bhopal,8500,101,2019-11-20 00:00:00,2,1560)
(5,Hardik,27,Bhopal,8500,100,2019-10-08 00:00:00,3,1500)
(5,Hardik,27,Bhopal,8500,102,2019-10-08 00:00:00,3,3000)
(4,Chaitali,25,Mumbai,6500,103,2018-05-20 00:00:00,4,2060)
(4,Chaitali,25,Mumbai,6500,101,2019-11-20 00:00:00,2,1560)
(4,Chaitali,25,Mumbai,6500,100,2019-10-08 00:00:00,3,1500)
(4,Chaitali,25,Mumbai,6500,102,2019-10-08 00:00:00,3,3000)
(3,kaushik,23,Kota,2020,103,2018-05-20 00:00:00,4,2060)
(3,kaushik,23,Kota,2020,101,2019-11-20 00:00:00,2,1560)
(3,kaushik,23,Kota,2020,100,2019-10-08 00:00:00,3,1500)
(3,kaushik,23,Kota,2020,102,2019-10-08 00:00:00,3,3000)
(2,Khilan,25,Delhi,1500,103,2018-05-20 00:00:00,4,2060)
(2,Khilan,25,Delhi,1500,101,2019-11-20 00:00:00,2,1560)
(2,Khilan,25,Delhi,1500,100,2019-10-08 00:00:00,3,1500)
(2,Khilan,25,Delhi,1500,102,2019-10-08 00:00:00,3,3000)
(1,Ramesh,32,Ahmedabad,2020,103,2018-05-20 00:00:00,4,2060)
(1,Ramesh,32,Ahmedabad,2020,101,2019-11-20 00:00:00,2,1560)
(1,Ramesh,32,Ahmedabad,2020,100,2019-10-08 00:00:00,3,1500)
(1,Ramesh,32,Ahmedabad,2020,102,2019-10-08 00:00:00,3,3000)
```
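Because the cross product keeps every combination, its size grows multiplicatively with the inputs, so in practice a CROSS is usually followed by a FILTER that keeps only the pairs of interest. Below is a minimal sketch, reusing the customers, orders, and cross_data relations defined above; the output path /pig_output/cross_matched is just an example, not part of the original walkthrough. After a CROSS, fields coming from the two inputs are referenced with the :: disambiguation operator, for example customers::id.

```bash
grunt> matched = FILTER cross_data BY customers::id == orders::customer_id;  -- keep only matching customer/order pairs
grunt> Dump matched;
grunt> STORE matched INTO 'hdfs://localhost:9000/pig_output/cross_matched' USING PigStorage(',');  -- example output path
```

Filtering the cross product this way yields the same pairs as an inner join (JOIN customers BY id, orders BY customer_id), but on large relations the JOIN form is preferred because CROSS first materializes every combination.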