hadoop@Master:~$ sqoop import --connect jdbc:mysql://192.168.1.178/hivetestdb --username chu888chu888 --password skybar --table cdsgus --m 2 --target-dir /user/sqoop/cdsgus
Warning: /usr/local/sqoop1/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop1/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop1/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/03/03 01:28:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/03/03 01:28:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/03/03 01:28:13 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/03/03 01:28:13 INFO tool.CodeGenTool: Beginning code generation
16/03/03 01:28:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `cdsgus` AS t LIMIT 1
16/03/03 01:28:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `cdsgus` AS t LIMIT 1
16/03/03 01:28:14 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hadoop/compile/7b9cf86a577c124c063ff5dc2242b3fb/cdsgus.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/03/03 01:28:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/7b9cf86a577c124c063ff5dc2242b3fb/cdsgus.jar
16/03/03 01:28:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/03/03 01:28:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/03/03 01:28:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/03/03 01:28:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/03/03 01:28:17 INFO mapreduce.ImportJobBase: Beginning import of cdsgus
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/03/03 01:28:17 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/03/03 01:28:18 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/03/03 01:28:18 INFO client.RMProxy: Connecting to ResourceManager at Master/192.168.1.80:8032
16/03/03 01:28:23 INFO db.DBInputFormat: Using read commited transaction isolation
16/03/03 01:28:23 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `cdsgus`
16/03/03 01:28:23 INFO mapreduce.JobSubmitter: number of splits:2
16/03/03 01:28:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1456939431067_0002
16/03/03 01:28:24 INFO impl.YarnClientImpl: Submitted application application_1456939431067_0002
16/03/03 01:28:24 INFO mapreduce.Job: The url to track the job: http://Master:8088/proxy/application_1456939431067_0002/
16/03/03 01:28:24 INFO mapreduce.Job: Running job: job_1456939431067_0002
16/03/03 01:28:38 INFO mapreduce.Job: Job job_1456939431067_0002 running in uber mode : false
16/03/03 01:28:38 INFO mapreduce.Job:  map 0% reduce 0%
16/03/03 01:32:11 INFO mapreduce.Job:  map 50% reduce 0%
16/03/03 01:32:13 INFO mapreduce.Job:  map 100% reduce 0%
16/03/03 01:32:14 INFO mapreduce.Job: Job job_1456939431067_0002 completed successfully
16/03/03 01:32:14 INFO mapreduce.Job: Counters: 31
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=247110
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=218
		HDFS: Number of bytes written=3130492684
		HDFS: Number of read operations=8
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=4
	Job Counters
		Killed map tasks=1
		Launched map tasks=3
		Other local map tasks=3
		Total time spent by all maps in occupied slots (ms)=422821
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=422821
		Total vcore-seconds taken by all map tasks=422821
		Total megabyte-seconds taken by all map tasks=432968704
	Map-Reduce Framework
		Map input records=20050144
		Map output records=20050144
		Input split bytes=218
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=19391
		CPU time spent (ms)=206680
		Physical memory (bytes) snapshot=313565184
		Virtual memory (bytes) snapshot=3757293568
		Total committed heap usage (bytes)=65142784
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=3130492684
16/03/03 01:32:14 INFO mapreduce.ImportJobBase: Transferred 2.9155 GB in 235.5966 seconds (12.672 MB/sec)
16/03/03 01:32:14 INFO mapreduce.ImportJobBase: Retrieved 20050144 records.
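The log shows "number of splits:2", so with --m 2 the target directory ends up with two part files (part-m-00000 and part-m-00001). A quick sanity check from HDFS (a sketch; the path is the --target-dir used above):

hadoop@Master:~$ hdfs dfs -ls /user/sqoop/cdsgus
hadoop@Master:~$ hdfs dfs -tail /user/sqoop/cdsgus/part-m-00000

Note also the MySQLManager hint in the log: adding --direct to the import switches to a mysqldump-based fast path, which can speed up large transfers like this one.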
Export all data from a MySQL table to Hive
hive> create database test_sqoop;
OK
Time taken: 0.81 seconds
hive> show databases;
OK
chu888chu888
default
test_sqoop
Time taken: 0.247 seconds, Fetched: 3 row(s)
hive>
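With the test_sqoop database in place, the table can be loaded straight into Hive by adding Sqoop's Hive options to the import. A minimal sketch, reusing the connection settings from above (--hive-import and --hive-table are standard Sqoop 1.4.6 flags; the dotted db.table form targets the new database):

hadoop@Master:~$ sqoop import --connect jdbc:mysql://192.168.1.178/hivetestdb --username chu888chu888 --password skybar --table cdsgus --hive-import --hive-table test_sqoop.cdsgus --m 2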
hadoop@Master:/$ sqoop import --connect jdbc:mysql://192.168.1.178/hivetestdb --username chu888chu888 --password skybar --table cdsgus
Warning: /usr/local/sqoop1/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop1/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop1/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/03/03 00:32:16 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/03/03 00:32:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/03/03 00:32:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/03/03 00:32:16 INFO tool.CodeGenTool: Beginning code generation
16/03/03 00:32:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `cdsgus` AS t LIMIT 1
16/03/03 00:32:16 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@654f0d9c is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@654f0d9c is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:914)
	at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2181)
	at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1542)
	at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
	at com.mysql.jdbc.Connection.execSQL(Connection.java:3277)
	at com.mysql.jdbc.Connection.execSQL(Connection.java:3206)
	at com.mysql.jdbc.Statement.executeQuery(Statement.java:1232)
	at com.mysql.jdbc.Connection.getMaxBytesPerChar(Connection.java:3673)
	at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:482)
	at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:443)
	at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
	at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
	at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
	at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
	at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
16/03/03 00:32:17 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
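This "Streaming result set ... is still active" error is not a problem with the table itself. The stack trace shows that while the streaming result set from "SELECT t.* FROM `cdsgus` AS t LIMIT 1" is still open, the driver's getPrecision/getMaxBytesPerChar call issues another query on the same connection, which older MySQL Connector/J releases reject; that is what cascades into the "No columns to generate for ClassWriter" failure. The usual fix is to put a newer mysql-connector-java jar into Sqoop's lib directory and retry (a sketch; the connector version and filename are examples, not taken from this log):

hadoop@Master:~$ cp mysql-connector-java-5.1.32-bin.jar /usr/local/sqoop1/lib/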
hadoop@hadoopmaster:/usr/local/hive/lib$ sqoop list-databases --connect jdbc:mysql://hadoopslave2 --username hive --password hive
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
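The HCatalog/Accumulo/ZooKeeper warnings are harmless for plain JDBC jobs like these. Once list-databases confirms connectivity, the same pattern lists the tables in a given database (a sketch; the database name here is just an example):

hadoop@hadoopmaster:/usr/local/hive/lib$ sqoop list-tables --connect jdbc:mysql://hadoopslave2/hive --username hive --password hive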