[Original] Using break in Scala
object BreakTest { def main(args: Array[String]): Unit = { // first way import scala.util.control.Breaks._ breakable{ for(i<-1 to 10){ println(i) if(i==5){ break } } } // second way import scala
2021-12-03 14:08:42
378
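For the break entry above, a minimal runnable sketch of the first pattern shown in the excerpt (the excerpt's second variant is cut off, so it is not reproduced here; BreakSketch is a hypothetical object name):

```scala
import scala.util.control.Breaks._

object BreakSketch extends App {
  breakable {                // break throws a control exception caught by this block
    for (i <- 1 to 10) {
      println(i)
      if (i == 5) break()    // exits the enclosing breakable block
    }
  }
  println("loop exited at 5")
}
```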
[Original] Scala's scan function
The scan function works like fold, but scan collects every intermediate result into a new collection, whereas fold returns only the final value. Signature: def scan[B >: Int, That](z: B)(op: (B, B) => B)(implicit cbf: scala.collection.generic.CanBuildFrom[Array[Int],B,That]): That
2021-12-02 15:44:51
2182
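A minimal sketch for the scan entry above, contrasting scan with fold (ScanSketch is a hypothetical name):

```scala
object ScanSketch extends App {
  val a = Array(1, 2, 3, 4, 5)
  println(a.scan(0)(_ + _).mkString(","))  // 0,1,3,6,10,15 -- every intermediate result
  println(a.fold(0)(_ + _))                // 15            -- only the final result
}
```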
[Original] Scala's reduce function
The reduce function works like fold but needs no initial value. Signature: def reduce[A1 >: Int](op: (A1, A1) => A1): A1. Data: val a = Array(1,2,3,4,5) scala> a.reduce((x,y)=>{println(x,y,x+y);x+y}) (1,2,3) (3,3,6) (6,4,10) (10,5,15) res88: Int = 15. The reduceLeft function, signature: override def reduceLeft[B >
2021-12-02 15:00:47
1329
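A minimal sketch for the reduce entry above; it also contrasts reduceLeft and reduceRight, where the excerpt cuts off (ReduceSketch is a hypothetical name):

```scala
object ReduceSketch extends App {
  val a = Array(1, 2, 3, 4, 5)
  println(a.reduce(_ + _))       // 15, no initial value needed
  println(a.reduceLeft(_ - _))   // ((((1-2)-3)-4)-5) = -13
  println(a.reduceRight(_ - _))  // (1-(2-(3-(4-5)))) = 3
}
```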
[Original] Scala's flatMap function
The flatMap function. Signature: def flatMap[B, That](f: Int => scala.collection.GenTraversableOnce[B])(implicit bf: scala.collection.generic.CanBuildFrom[Array[Int],B,That]): That
2021-12-02 14:58:13
1742
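A minimal sketch for the flatMap entry above: map each element to a collection, then flatten (FlatMapSketch is a hypothetical name):

```scala
object FlatMapSketch extends App {
  val words = Array("hello world", "scala flatMap")
  println(words.flatMap(_.split(" ")).mkString(","))                    // hello,world,scala,flatMap
  println(Array(1, 2, 3).flatMap(x => Array(x, x * 10)).mkString(","))  // 1,10,2,20,3,30
}
```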
[Original] Scala's fold function
The fold function. Signature: def fold[A1 >: Int](z: A1)(op: (A1, A1) => A1): A1. Data: scala> val a=Array(1,2,3,4,5) a: Array[Int] = Array(1, 2, 3, 4, 5). Without partitioning: scala> a.fold(5)((x,y)=>{println(x,y,x+y);x+y}) (5,1,6) (6,2,8) (8,3,11) (11,4,15) (15,5,20) res1
2021-12-02 14:56:32
511
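A minimal sketch for the fold entry above (assuming Scala 2.12, where .par is built in); with .par the initial value may be applied once per partition, which is why the post contrasts the partitioned and non-partitioned runs:

```scala
object FoldSketch extends App {
  val a = Array(1, 2, 3, 4, 5)
  println(a.fold(5)(_ + _))      // 20: sequential, the initial value 5 is used exactly once
  println(a.par.fold(0)(_ + _))  // 15: with a neutral initial value, partitioning does not change the result
}
```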
[Original] Scala's aggregate function
The aggregate function. Signature: def aggregate[S](z: => S)(seqop: (S, Int) => S, combop: (S, S) => S): S. How it works: a.par.aggregate((0,0))((x,y)=>{(x._1+y,x._2+1)},(x,y)=>{(x._1+y._1,x._2+y._2)}) may split the list into several partitions, for example 3: p1(1,2,3,4), p2(5,6,7), p3(8,9); the first function aggregates within each partition, (1+2+3+4,
2021-12-02 14:55:11
593
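A minimal sketch for the aggregate entry above, computing (sum, count) in one pass to derive an average (assuming Scala 2.12; AggregateSketch is a hypothetical name):

```scala
object AggregateSketch extends App {
  val a = (1 to 9).toArray
  val (sum, count) = a.par.aggregate((0, 0))(
    (acc, n) => (acc._1 + n, acc._2 + 1),  // seqop: fold the elements of one partition
    (x, y) => (x._1 + y._1, x._2 + y._2)   // combop: merge the per-partition results
  )
  println(s"sum=$sum count=$count avg=${sum.toDouble / count}")  // sum=45 count=9 avg=5.0
}
```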
[Original] Scala companion classes and companion objects
/** A companion class and its companion object can access each other's members */ class Oop1(name:String,age:Int) { // Scala's default primary constructor println("class Oop1 one") private var uname:String=name private var uage:Int=age private var uaddress:String= _ def this(){ this("",0) printl
2021-12-01 19:56:11
246
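A minimal companion sketch for the entry above (a hypothetical Person class, not the post's Oop1): the object can call the class's private constructor, and apply gives constructor-like syntax:

```scala
class Person private (val name: String, val age: Int)

object Person {
  // the companion object can access the private constructor of its companion class
  def apply(name: String, age: Int): Person = new Person(name, age)
}

object CompanionSketch extends App {
  val p = Person("wjj", 20)   // calls Person.apply
  println(p.name + " " + p.age)
}
```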
[Original] Scala contravariance and covariance
/** Covariance, contravariance, invariance */ object ObjCovariantDemo { class Animal{ // animal println("animals eat") } class CatAnimal extends Animal { // feline println("felines eat meat") } class Tiger extends CatAnimal{ // a tiger is a feline println("tigers eat people") } class PetMaster[
2021-12-01 19:55:10
202
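A minimal variance sketch for the entry above (hypothetical Cage and Feeder classes, not the post's PetMaster):

```scala
class Animal
class Cat extends Animal

class Cage[+A]    // covariant: a Cage[Cat] can be used where a Cage[Animal] is expected
class Feeder[-A]  // contravariant: a Feeder[Animal] can be used where a Feeder[Cat] is expected

object VarianceSketch extends App {
  val cage: Cage[Animal] = new Cage[Cat]       // compiles because of +A
  val feeder: Feeder[Cat] = new Feeder[Animal] // compiles because of -A
  println(cage + " " + feeder)
}
```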
[Original] Connecting to MySQL from Scala: insert and query
class MysqlDemo { var driver="com.mysql.jdbc.Driver" var url="jdbc:mysql://192.168.133.151:3306/mybatisdb" var user="root" var pwd="root" def this(dirver:String,url:String,user:String,pwd:String){ this() this.driver=dirver this.url=u
2021-11-30 16:06:54
543
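A minimal JDBC sketch for the MySQL entry above, reusing the driver, URL, and credentials from the excerpt; the student(id, name) table is an assumption:

```scala
import java.sql.{Connection, DriverManager}

object MysqlSketch extends App {
  Class.forName("com.mysql.jdbc.Driver")
  val conn: Connection = DriverManager.getConnection(
    "jdbc:mysql://192.168.133.151:3306/mybatisdb", "root", "root")
  try {
    // insert (assumes a student(id, name) table)
    val insert = conn.prepareStatement("insert into student(name) values (?)")
    insert.setString(1, "zhangsan")
    insert.executeUpdate()
    // query
    val rs = conn.createStatement().executeQuery("select id, name from student")
    while (rs.next()) println(rs.getInt("id") + "\t" + rs.getString("name"))
  } finally conn.close()
}
```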
原创 scala 隐式参数、方法、类
package aa.bb/** * 全部存放着隐式 参数 方法 类 */object Impliciteg { //隐式参数 类型不能相同 // implicit val num1:Int=11 implicit val num2:Int=12 implicit val str:String="hello" //隐式方法 参数和返回值类型组合不能相同 implicit def stringToInt(value:String):Int={ Inte
2021-11-30 16:02:01
247
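A minimal sketch for the implicits entry above, showing the three kinds in one object (hypothetical names; assumed Scala 2.x):

```scala
object ImplicitSketch extends App {
  // implicit parameter: filled in from the implicit scope
  implicit val greeting: String = "hello"
  def greet(name: String)(implicit prefix: String): String = prefix + ", " + name

  // implicit method: converts String to Int where an Int is expected
  implicit def stringToInt(value: String): Int = Integer.parseInt(value)
  val n: Int = "42"

  // implicit class: adds a method to Int without modifying it
  implicit class RichInt(x: Int) { def squared: Int = x * x }

  println(greet("wjj"))  // hello, wjj
  println(n.squared)     // 1764
}
```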
[Original] Scala pattern matching
package aa.bb object MatchDemo { def method1(x:Int):String={ if(x==1){ "one" }else if(x==2){ "two" }else{ "many" } } def method2:PartialFunction[Int,String]={ case 1=>"one" case 2=>"two" case _=>
2021-11-30 15:30:29
101
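A minimal sketch for the pattern-matching entry above: the match expression equivalent of the excerpt's if/else chain (MatchSketch is a hypothetical name):

```scala
object MatchSketch extends App {
  def describe(x: Int): String = x match {
    case 1 => "one"
    case 2 => "two"
    case _ => "many"
  }
  println(describe(1))  // one
  println(describe(7))  // many
}
```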
[Original] Two examples of functions as parameters and as return values in Scala
object functionDemo{ def funTest5(str1:String,str2:String,fun:(String,String)=>Int):(Int,Int)=>Int={ val len=fun(str1,str2) if(len>20){ (a,b)=>a*b }else{ (a,b)=>a%b } } println( funTest5("hello!how are y
2021-11-30 15:26:14
697
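A minimal sketch for the entry above: one method that both takes a function and returns one (hypothetical names, simplified from the excerpt's funTest5):

```scala
object HigherOrderSketch extends App {
  // the length function is a parameter; the result is itself a function
  def pick(str1: String, str2: String, len: (String, String) => Int): (Int, Int) => Int =
    if (len(str1, str2) > 20) (a, b) => a * b else (a, b) => a % b

  val chosen = pick("hello", "world", (a, b) => a.length + b.length)
  println(chosen(9, 4))  // 1, because 5 + 5 = 10 is not > 20, so the modulo function is returned
}
```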
[Original] Scala formal parameters with * (varargs)
object functionDemo{ def showmsg2(name:String,s:String*):Unit={ println(name) //wjj println(s.getClass) //class scala.collection.mutable.WrappedArray$ofRef for(str<-s){ println(str) //kb15,kb16 } } showm
2021-11-30 15:22:43
786
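A minimal varargs sketch for the entry above: String* collects any number of trailing arguments into a Seq (hypothetical names):

```scala
object VarargsSketch extends App {
  def showMsg(name: String, tags: String*): Unit = {
    println(name)
    tags.foreach(println)  // tags behaves like a Seq[String]
  }
  showMsg("wjj", "kb15", "kb16")
  showMsg("nobody")        // zero trailing arguments is also allowed
}
```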
[Original] Scala functions as parameters
object FunctionDemo { def funTest(f:(Int,Int)=>Int):Int={ val a=100 val b=50 f(a,b) } val sum:(Int,Int)=>Int=(a:Int,b:Int)=>a+b val ji:(Int,Int)=>Int=(a:Int,b:Int)=>a*b val result= funTest(ji) printl
2021-11-30 15:18:36
334
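A minimal sketch for the function-as-parameter entry above, passing two different (Int, Int) => Int functions to the same method (hypothetical names):

```scala
object FunAsParamSketch extends App {
  def apply2(f: (Int, Int) => Int): Int = f(100, 50)
  println(apply2(_ + _))  // 150
  println(apply2(_ * _))  // 5000
}
```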
[Original] Scala functions as return values
object FunctionDemo { def funTest2():(String,String)=>String={ def funDemo(str:String,str2:String):String={ str+"#####"+str2 } funDemo } println(funTest3()("1", "2")) //1#####2 def funTest1(num:Int):(Int,Int)=
2021-11-30 15:10:46
465
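A minimal sketch for the function-as-return-value entry above: the returned function closes over the separator (hypothetical names):

```scala
object FunAsReturnSketch extends App {
  def join(sep: String): (String, String) => String = (a, b) => a + sep + b
  println(join("#####")("1", "2"))  // 1#####2
}
```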
[Original] Scala currying
object FunctionDemo { def fun(a:Int,b:Int,c:Int,d:Int):Int={ a+b+c+d } def fun2(a:Int,b:Int)(c:Int,d:Int):Int={ a+b+c+d } def fun3(a:Int,b:Int,c:Int)(d:Int):Int={ a+b+c+d } def fun4(a:Int)(b:Int,c:Int,d:Int):Int={ a+b+c+d }
2021-11-30 15:06:31
80
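A minimal currying sketch for the entry above (assuming Scala 2.x, where a trailing underscore turns the remaining parameter list into a function value):

```scala
object CurrySketch extends App {
  def add(a: Int, b: Int)(c: Int, d: Int): Int = a + b + c + d
  val addTen = add(4, 6) _   // first list fixed, yielding an (Int, Int) => Int
  println(add(1, 2)(3, 4))   // 10
  println(addTen(3, 4))      // 17
}
```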
[Original] Scala partially applied functions
object FunctionDemo { def alarmMsg(title:String,content:String,height:Double):Unit={ println(title+"\t"+content+": "+height) } alarmMsg("Alarm","air formaldehyde level is",21.32d) //Alarm air formaldehyde level is: 21.32 val title:String="Alarm" def alarmMsg2=alarmMsg(title,_:String,_
2021-11-30 15:02:37
117
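A minimal sketch for the partially-applied-function entry above, fixing the title argument and leaving the rest as underscores (hypothetical names and messages):

```scala
object PartialAppSketch extends App {
  def alarm(title: String, content: String, value: Double): Unit =
    println(title + "\t" + content + ": " + value)

  val airAlarm = alarm("Alarm", _: String, _: Double)  // title is fixed
  airAlarm("air formaldehyde level is", 21.32)
  airAlarm("PM2.5 level is", 75.0)
}
```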
[Original] Scala partial functions (PartialFunction)
Partial functions are often used for data transformation. object FunctionDemo { def funPatrtition3:PartialFunction[String,Int]={ case "男"=>1 case "male"=>1 case "女"=>0 case "female"=>0 case _=>(-1) } val arr = Array("男","女","male","female","中") val ints: A
2021-11-30 14:57:31
624
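A minimal PartialFunction sketch for the entry above, using collect for the data transformation the excerpt describes (English labels substituted for the post's Chinese ones):

```scala
object PartialFunctionSketch extends App {
  val genderToInt: PartialFunction[String, Int] = {
    case "male"   => 1
    case "female" => 0
    case _        => -1
  }
  val labels = Array("male", "female", "unknown")
  println(labels.collect(genderToInt).mkString(","))  // 1,0,-1
}
```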
[Original] Hive zipper tables (拉链表)
orders.txt1 2021-11-20 2021-11-20 创建2 2021-11-20 2021-11-20 创建3 2021-11-20 2021-11-20 创建hive> drop database lalian cascade;hive> create database lalian;hive> use lalian;hive> create table orders( orderid int, createdate s
2021-11-28 21:20:45
146
[Original] Sqoop job study notes
----------------------job-------------------------------[root@mihaoyu151 tmp]# sqoop job --create demojob -- import \> --connect jdbc:mysql://mihaoyu151:3306/mybatisdb \> --username root \> --password root \> --table student \> --delet
2021-11-28 21:08:51
733
[Original] Exporting HDFS data to MySQL with Sqoop
--------------------export 导出---------------------------------[root@mihaoyu151 tmp]# sqoop export \> --connect jdbc:mysql://mihaoyu151:3306/mybatisdb \> --username root \> --password root \> --table teacher \> --export-dir /sqoop/expor
2021-11-28 21:06:39
1202
[Original] Sqoop error: Output directory hdfs://mihaoyu151:9000/user/root/student already exists
Error log: ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://mihaoyu151:9000/user/root/student already exists [root@mihaoyu151 ~]# sqoop import \> --connect jdbc:my
2021-11-23 14:31:19
1276
[Original] Importing MySQL data into Hive with Sqoop
-----------------------mysql to hive (1. create the table first, then import the data)------------------------------ Create table kb15.student_mysql in Hive with the same structure as the student table in MySQL: [root@mihaoyu151 ~]# sqoop create-hive-table \> --connect jdbc:mysql://mihaoyu151:3306/mybatisdb \> --username root \
2021-11-23 14:22:45
1349
[Original] Importing MySQL data into HDFS with Sqoop
---------------------mysql to hdfs--------------------------------[root@mihaoyu151 ~]# sqoop import \> --connect jdbc:mysql://mihaoyu151:3306/mybatisdb \> --username root \> --password root \> --table student \> --delete-target-dir \
2021-11-23 14:21:24
1281
[Original] Sqoop installation tutorial (single node)
[root@mihaoyu151 ~]# ll /opt/install/total 2597160-rw-r--r--. 1 root root 16870735 Nov 23 07:42 sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz[root@mihaoyu151 ~]# tar -zxf /opt/install/sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /opt/soft/[root@mihao
2021-11-23 14:16:56
881
[Original] HBase installation tutorial (single node)
Install Hadoop and ZooKeeper first. [root@mihaoyu151 ~]# cd /opt/install/ [root@mihaoyu151 install]# ll total 2580684 -rw-r--r--. 1 root root 433895552 Oct 25 17:59 hadoop-2.6.0-cdh5.14.2.tar.gz -rw-r--r--. 1 root root 267038262 Nov 19 17:07 hbase-1.2.0-cdh5.14.2.tar
2021-11-22 15:33:18
847
[Original] HBase Java client operations in IDEA
Create a Maven project. pom.xml: <dependency> <groupId>org.apache.hbase</groupId> <artifactId>hbase-client</artifactId> <version>1.1.2</version> </dependency> <dependency> <grou
2021-11-22 15:26:50
668
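A possible Scala sketch for the HBase client entry above, using the hbase-client 1.x Java API; the table name kb15:customer and column family addr come from the later Hive/HBase post, and the ZooKeeper quorum host is an assumption:

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get, Put}
import org.apache.hadoop.hbase.util.Bytes

object HBaseSketch extends App {
  val conf = HBaseConfiguration.create()
  conf.set("hbase.zookeeper.quorum", "mihaoyu151")  // assumed single-node quorum
  val conn = ConnectionFactory.createConnection(conf)
  val table = conn.getTable(TableName.valueOf("kb15:customer"))

  // write one cell, then read it back
  val put = new Put(Bytes.toBytes("row1"))
  put.addColumn(Bytes.toBytes("addr"), Bytes.toBytes("city"), Bytes.toBytes("shanghai"))
  table.put(put)

  val result = table.get(new Get(Bytes.toBytes("row1")))
  println(Bytes.toString(result.getValue(Bytes.toBytes("addr"), Bytes.toBytes("city"))))

  table.close()
  conn.close()
}
```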
[Original] Common HBase commands
Check the version: hbase(main):001:0> version Check the status: hbase(main):002:0> status Help: hbase(main):006:0> help 'status' hbase(main):006:0> help 'get' hbase(main):007:0> help 'put' List all namespaces (databases): hbase(main):008:0> list_namespace List the tables in a given namespace: hbase(main):009:0>
2021-11-22 15:13:37
828
[Original] Associating Hive tables with HBase tables
In HBase, create namespace kb15 and table customer with column families addr and order: hbase(main):004:0> create_namespace 'kb15' hbase(main):063:0> create 'kb15:customer','addr','order' List the tables under namespace kb15: hbase(main):064:0> list_namespace_tables 'kb15' TABLE
2021-11-22 15:06:45
843
[Original] Configuring and using Hive in Zeppelin
Add the Hive config file and jar to Zeppelin: [root@mihaoyu151 bin]# pwd /opt/soft/zeppelin090/bin [root@mihaoyu151 bin]# cd ../conf [root@mihaoyu151 conf]# cp /opt/soft/hive110/conf/hive-site.xml /opt/soft/zeppelin090/conf [root@mihaoyu151 conf]# cd ../interpreter/jdbc/ [r
2021-11-15 15:16:18
915
[Original] Installing Apache Zeppelin
[root@mihaoyu151 ~]# cd /opt/install [root@mihaoyu151 install]# ll -rw-r--r--. 1 root root 1582476522 Nov 15 13:40 zeppelin-0.9.0-preview1-bin-all.tgz # extract [root@mihaoyu151 install]# tar -zxf zeppelin-0.9.0-preview1-bin-all.tgz -C ../soft [root@mihaoyu15
2021-11-15 15:02:21
1177
[Original] Hive study notes: user-defined functions
Create a Maven project in IDEA. pom.xml: <dependency> <groupId>org.apache.hive</groupId> <artifactId>hive-exec</artifactId> <version>3.1.0</version></dependency> UDFOne.java: import org.apache.hadoop.hive.ql.
2021-11-13 09:42:20
933
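A possible sketch for the UDF entry above, written in Scala against the old-style org.apache.hadoop.hive.ql.exec.UDF API that hive-exec provides (the post's UDFOne.java is cut off, so the class body here is an assumption):

```scala
import org.apache.hadoop.hive.ql.exec.UDF

// Hive locates evaluate() by reflection; after adding the jar, register with:
//   create temporary function my_upper as 'UpperUDF';
class UpperUDF extends UDF {
  def evaluate(input: String): String =
    if (input == null) null else input.toUpperCase
}
```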
[Original] Hive study notes: window functions
0: jdbc:hive2://192.168.133.151:10000> create table t_window(. . . . . . . . . . . . . . . . . . .> name string,. . . . . . . . . . . . . . . . . . .> orderdate date,. . . . . . . . . . . . . . . . . . .> cost int. . . . . . . . . . . . . .
2021-11-13 09:32:37
915
[Original] Hive study notes: common commands
show databases; -- list databases show functions; -- list functions show tables; -- list tables show create table demo; -- table DDL show partitions student2; -- table partitions desc database hivedemo; -- describe a database desc demo; -- describe a table desc formatted demo; -- describe a table in detail use hivedemo; -- switch database drop table demotwo; -- drop a table drop database hivedemo cascade; -- drop a database and everything in it
2021-11-10 16:34:06
929
[Original] Hive error: AccessControlException Permission denied: user=anonymous, access=WRITE, inode="/user/hive/ware
Error log: 2021-11-10 18:05:44,528 ERROR [HiveServer2-Background-Pool: Thread-6225]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(193)) - MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission de
2021-11-10 13:53:11
5646
[Original] Installing Hive automatically with a Linux script
Put hive-1.1.0-cdh5.14.2.tar.gz and mysql-connector-java-5.1.25.jar into the /opt/install directory. [root@mihaoyu151 ~]# vi /opt/mysyssh/autoinstall.sh #!/bin/bash hive=true installdir=/opt/soft if [ ! -d "$installdir" ] then mkdir $installdir fi if [ "$hive" = true ] th
2021-11-09 14:41:26
158
[Original] Hive installation: schema initialization error Error: Duplicate key name 'PCS_STATS_IDX' (state=42000,code=1061)
Error log: [root@mihaoyu151 conf]# schematool -dbType mysql -initSchema which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/bin:/opt/soft/jdk180/bin:/opt/soft/zookeeper345/bin:/opt/soft/hadoop260/sbin:/opt/soft/hadoop260/bin:/opt/sof
2021-11-09 14:31:44
2051
[Original] Hive beeline startup error: Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":root:supergroup:drw
Error log: [root@mihaoyu151 conf]# beeline -u jdbc:hive2://192.168.133.151:10000 which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/bin:/opt/soft/jdk180/bin:/opt/soft/zookeeper345/bin:/opt/soft/hadoop26/sbin:/opt/soft/hadoop260/bin:
2021-11-09 14:08:07
452
[Original] Two ways to start Hive
1. hive: [root@mihaoyu151 conf]# hive which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/bin:/opt/soft/jdk180/bin:/opt/soft/zookeeper345/bin:/opt/soft/hadoop260/sbin:/opt/soft/hadoop260/bin:/opt/soft/jdk180/bin:/opt/soft/hadoop2
2021-11-09 14:04:56
3799
[Original] Installing Hive on Linux
[root@mihaoyu151 install]# tar -zxf ./hive-1.1.0-cdh5.14.2.tar.gz -C /opt/soft/[root@mihaoyu151 install]# cd ../soft[root@mihaoyu151 soft]# mv hive-1.1.0-cdh5.14.2/ hive110[root@mihaoyu151 soft]# vi /etc/profile#hiveexport HIVE_HOME=/opt/soft/hive1
2021-11-09 13:55:38
1166