Question

I usually keep common/shared libraries and the actual job code in separate jar files. Is it possible to recompile only the job jar and still execute it with the hadoop command hadoop jar asd? If not, is there a workaround to simplify jar packaging?


Solution

I am using Ant to build the job jar. To include all the common/shared libraries, add a line like this to the buildConfig.xml file:

<zipgroupfileset dir="pathToAllCommonAndSharedLibraries" includes="**/*.jar" />

Here is a minimal example of the build file.

<?xml version="1.0" encoding="UTF-8"?>
<project name="Example.makejar" default="jar" basedir=".">
    <target name="jar">
        <jar destfile="Example.jar" basedir="bin">
            <manifest></manifest>
            <zipgroupfileset dir="pathToAllCommonAndSharedLibraries" includes="**/*.jar" />
        </jar>
    </target>
</project>
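
If you also want to run the jar without naming the driver class on the command line, the empty manifest above can be given a Main-Class attribute. This is a sketch of the same jar task with that addition; the class name com.example.MyJob is a placeholder for your own job driver, not something from the original build file:

```xml
<!-- Variant of the jar task above with a Main-Class entry.
     Assumption: com.example.MyJob stands in for your own driver class. -->
<jar destfile="Example.jar" basedir="bin">
    <manifest>
        <attribute name="Main-Class" value="com.example.MyJob"/>
    </manifest>
    <zipgroupfileset dir="pathToAllCommonAndSharedLibraries" includes="**/*.jar" />
</jar>
```

With a Main-Class set, hadoop jar Example.jar followed by the job arguments is enough; hadoop jar reads the class name from the manifest.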

Other tips

I am not sure whether Hadoop supports this directly (I will check later), but here is a workaround: if you build your job projects with Maven, use the Maven Shade plugin or the Maven Assembly plugin to embed all your dependencies into the jar file, so that you deploy only one file.
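
As a sketch of the Maven route, a shade-plugin configuration like the following bundles all compile-scope dependencies into the job jar during mvn package. The version number is illustrative; check for the current plugin release:

```xml
<!-- pom.xml fragment: bind maven-shade-plugin to the package phase
     so that "mvn package" produces one self-contained jar. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.5.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

After mvn package, the single shaded jar under target/ can be passed directly to hadoop jar.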

License: CC-BY-SA with attribution
Not affiliated with StackOverflow