Opened 8 years ago

Last modified 8 years ago

#53595 new defect

hadoop fails to install before and after selfupdate to 2.4.0

Reported by: TheLastLovemark
Owned by:
Priority: Normal
Milestone:
Component: ports
Version: 2.4.0
Keywords:
Cc:
Port: hadoop

Description

As the summary says, Hadoop failed to install.

Running "port info --maintainers hadoop" shows no maintainers specified.

The main.log file is attached.

I am not sure if there is a conflict with some other port. I have included a list of every port installed (active/inactive) as well.

Attachments (3)

hadoop_main_log.txt (32.9 KB) - added by TheLastLovemark 8 years ago.
main.log
installed_02172017.txt (41.2 KB) - added by TheLastLovemark 8 years ago.
installed ports
main.log (33.0 KB) - added by TheLastLovemark 8 years ago.
Hadoop Logfile


Change History (26)

Changed 8 years ago by TheLastLovemark

Attachment: hadoop_main_log.txt added

main.log

Changed 8 years ago by TheLastLovemark

Attachment: installed_02172017.txt added

installed ports

comment:1 Changed 8 years ago by mf2k (Frank Schima)

Keywords: hadoop port install fail removed

comment:2 Changed 8 years ago by TheLastLovemark

I read this: #49113

I have Java 1.6 and 1.8 installed because I run older versions of Adobe Creative Suite. I think 1.6 is running as system default.


comment:3 Changed 8 years ago by kencu (Ken)

hmm.

"/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1" && ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true
info:build Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0

what version of ant are you running? ("Unsupported major.minor version 52.0" means the ant launcher class was compiled for Java 8, so whatever java is running ant here is older than 1.8.)

$ ant -v
Apache Ant(TM) version 1.8.4 compiled on August 8 2013

comment:4 Changed 8 years ago by TheLastLovemark

sh-3.2# ant -v
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

I'm guessing this has to do with the Java issue I mentioned. How do I switch the default version of Java?


comment:5 in reply to:  3 Changed 8 years ago by ryandesign (Ryan Carsten Schmidt)

Replying to kencu:

what version of ant are you running?

$ ant -v
Apache Ant(TM) version 1.8.4 compiled on August 8 2013

You appear to be out of date, Ken. In MacPorts we currently have:

Apache Ant(TM) version 1.10.1 compiled on February 2 2017

comment:6 Changed 8 years ago by kencu (Ken)

yeah - that was the /usr/bin/ant system version on the 10.6.8 machine I use at work :>

@TheLastLovemark

Looks like you're closer to solving your problem. You need to figure out why ant won't work. Sounds like you've been messing with java some. But you're on the path toward your fix, anyway.

comment:7 Changed 8 years ago by TheLastLovemark

@ken I haven't "been messing with" it so much as Apple Java 1.6 is required for Adobe Creative Suite, so I had to install JVM 1.6 for it to work.

comment:8 Changed 8 years ago by TheLastLovemark

Here's what I've found on my system:

$ which java
/usr/bin/java
$ java -version
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-468-11M4833)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-468, mixed mode)
$ pwd
/Library/Internet Plug-Ins/JavaAppletPlugin.plugin/Contents/Home/bin
$ ./java -version
java version "1.8.0_121"
Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
$ cd /usr/bin/java
-bash: cd: /usr/bin/java: Not a directory
$ cd /usr/bin/
$ ./java -version
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-468-11M4833)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-468, mixed mode)
$ pwd
/usr/bin

How do I switch? Will switching cause problems with other stuff?


comment:9 Changed 8 years ago by kencu (Ken)

Google is your friend ... this looks promising, but I haven't tried it <https://wimdeblauwe.wordpress.com/2014/03/20/switching-easily-between-java-jdks-on-mac-os-x/>
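
For reference, the approach in that post amounts to a few shell functions in ~/.bash_profile that point JAVA_HOME at a chosen JDK via /usr/libexec/java_home. An untested sketch along those lines (the setJdk names are the post's convention, not anything standard):

# hypothetical ~/.bash_profile helpers for switching JDKs (untested sketch)
function setJdk6 { export JAVA_HOME=$(/usr/libexec/java_home -v 1.6); }
function setJdk7 { export JAVA_HOME=$(/usr/libexec/java_home -v 1.7); }
function setJdk8 { export JAVA_HOME=$(/usr/libexec/java_home -v 1.8); }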

No idea about what problems it might cause, sorry.

comment:10 Changed 8 years ago by TheLastLovemark

I've been trying that. The problem is that on macOS Sierra, the latest version of Java (1.8.0_121) installs to /Library/Internet Plug-Ins/JavaAppletPlugin.plugin/Contents/Home/bin,

not to /Library/Java/JavaVirtualMachines/<JDK VERSION HERE>/Contents/Home/ like every example I have come across.

Either Hadoop/Ant can't run on Java 1.6 or neither of them can find Java 1.8 because they don't know where to look. I think it's both. Any help would be really appreciated.

1.8 is not even in the list when I type:

$ /Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -version
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-468-11M4833)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-468, mixed mode)
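
For what it's worth, /usr/libexec/java_home only registers full JDKs installed under /Library/Java/JavaVirtualMachines, not the applet-plugin JRE, which would explain 1.8 missing from that list. A quick way to see everything it does know about:

/usr/libexec/java_home -V    # -V lists all registered JVMs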

comment:11 Changed 8 years ago by TheLastLovemark

OK... I got java in order. You have to install the full JDK, not the JRE.

Hadoop still fails. I have traced it to the macports environment. Java 1.6 is specified in the JAVA_HOME variable:

--->  Building hadoop
DEBUG: Executing org.macports.build (hadoop)
DEBUG: Environment: 
CC_PRINT_OPTIONS='YES'
CC_PRINT_OPTIONS_FILE='/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/.CC_PRINT_OPTIONS'
CPATH='/opt/local/include'
JAVA_HOME='/System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home'
LIBRARY_PATH='/opt/local/lib'
MACOSX_DEPLOYMENT_TARGET='10.12'

I have to edit this now I guess. I wonder how long this will take to figure out...

comment:12 Changed 8 years ago by kencu (Ken)

Java 1.6 is hard-coded into the port file

cat `port file hadoop`

and, in fact, seems to be required:

set java_home       /System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home

pre-fetch {
    # This port works on Mac OS X 10.6 or later, because 'sudo option -E'
    # is not available on Mac OS X 10.5 or earlier. See #34665.
    if {${os.platform} eq "darwin" && ${os.major} <= 9} {
        ui_error "This port works on Mac OS X 10.6 (Snow Leopard) or later"
        return -code error "Mac OS X 10.6 (Snow Leopard) or later is required"
    }
    if {![file exists ${java_home}]} {
        ui_error "Java 1.6 is required, but not located at ${java_home}"
        return -code error "Java 1.6 missing"
    }
}
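
A quick sanity check (nothing more) is to confirm whether that hard-coded path exists at all on your system:

ls /System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home

If it didn't, the pre-fetch block above would have stopped the build; since your build got past fetch, it presumably does.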

comment:13 Changed 8 years ago by TheLastLovemark

I can't figure this out.

comment:14 Changed 8 years ago by kencu (Ken)

OK. You need to get ant to work. Let's see what's going on. Try this:

which ant

I get /usr/bin/ant. Then see what that is:

ls -la /usr/bin/ant

I get this:

lrwxr-xr-x  1 root  wheel  22  1 Sep 13:52 /usr/bin/ant -> /usr/share/ant/bin/ant

OK. A link. Let's see what that points to:

ls -la /usr/share/ant/bin/ant
-rwxr-xr-x  1 root  wheel  9973  1 Sep 13:52 /usr/share/ant/bin/ant

all right. Go there:

cd /usr/share/ant/bin/

and see what's up

$ ls -la
total 0
drwxr-xr-x  8 root  wheel   272  3 Jun  2011 .
drwxr-xr-x  7 root  wheel   238  3 Jun  2011 ..
-rwxr-xr-x  1 root  wheel  9973  1 Sep 13:52 ant
-rwxr-xr-x  1 root  wheel   861  1 Sep 13:52 antRun
-rwxr-xr-x  1 root  wheel  2199  1 Sep 13:52 antRun.pl
-rwxr-xr-x  1 root  wheel  3219  1 Sep 13:52 complete-ant-cmd.pl
-rwxr-xr-x  1 root  wheel  4422  1 Sep 13:52 runant.pl
-rwxr-xr-x  1 root  wheel  3401  1 Sep 13:52 runant.py

OK. cat ./ant shows it's just a big long script.

Let's run it:

$ ./ant -version
Apache Ant(TM) version 1.8.2 compiled on June 3 2011

Would you like to try some of these steps and see if your links match up?

In the end, you need to get ant -version to give you the expected output.

comment:15 Changed 8 years ago by kencu (Ken)

You won't want to hear this perhaps, but I just installed hadoop to see what's up, and it installed in a couple of minutes without any trouble (on this 10.7 machine I had here at hand). I guess it must be something about your java setup that has been changed with whatever you did to allow Adobe to work.

$ port -v installed hadoop
The following ports are currently installed:
  hadoop @1.2.1_0+pseudo (active) platform='darwin 11' archs='x86_64'

comment:16 Changed 8 years ago by TheLastLovemark

OK. So last night I managed to get the full 1.8 and 1.7 JDKs installed then went to sleep. When I woke up, I did your ant -version check as root.

Here are the full results (line breaks added for clarity):

sh-3.2# which ant
/opt/local/bin/ant

sh-3.2# ls -la /opt/local/bin/ant
lrwxr-xr-x  1 root  admin  32 Feb  6 22:34 /opt/local/bin/ant -> ../share/java/apache-ant/bin/ant

sh-3.2# ls -la /opt/local/share/java/apache-ant/bin/ant
-rwxr-xr-x  1 root  admin  11698 Feb  2 14:00 /opt/local/share/java/apache-ant/bin/ant

sh-3.2# cd /opt/local/share/java/apache-ant/bin/

sh-3.2# ls -la
total 72
drwxr-xr-x   8 root  admin    272 Feb 17 13:24 .
drwxr-xr-x  10 root  admin    340 Feb 17 13:24 ..
-rwxr-xr-x   1 root  admin  11698 Feb  2 14:00 ant
-rwxr-xr-x   1 root  admin    861 Feb  2 14:00 antRun
-rwxr-xr-x   1 root  admin   2116 Feb  2 14:00 antRun.pl
-rwxr-xr-x   1 root  admin   3473 Feb  2 14:00 complete-ant-cmd.pl
-rwxr-xr-x   1 root  admin   4333 Feb  2 14:00 runant.pl
-rwxr-xr-x   1 root  admin   3385 Feb  2 14:00 runant.py

sh-3.2# cat ./ant
#! /bin/sh

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Extract launch and ant arguments, (see details below).
ant_exec_args=
no_config=false
use_jikes_default=false
ant_exec_debug=false
show_help=false

if [ -z "$PROTECT_NL" ]
then
  PROTECT_NL=true
  os=`uname -s`
  rel=`uname -r`
  # heirloom bourne-shell used by Solaris 10 is not POSIX
  #  it lacks features necessary to protect trailing NL from subshell trimming
  if [ "$os" = SunOS -a "$rel" = "5.10" ]
  then
    PROTECT_NL=false
  fi
fi

for arg in "$@" ; do
  if [ "$arg" = "--noconfig" ] ; then
    no_config=true
  elif [ "$arg" = "--usejikes" ] ; then
    use_jikes_default=true
  elif [ "$arg" = "--execdebug" ] ; then
    ant_exec_debug=true
  elif [ my"$arg" = my"--h"  -o my"$arg" = my"--help"  ] ; then
    show_help=true
    ant_exec_args="$ant_exec_args -h"
  else
    if [  my"$arg" = my"-h"  -o  my"$arg" = my"-help" ] ; then
      show_help=true
    fi

    if [ "$PROTECT_NL" = "true" ] ; then
      # pad the value with X to protect trailing NLs from subshell output trimming
      esc_arg="${arg}X"
    else
      esc_arg="${arg}"
    fi

    # wrap all arguments as "" strings, escape any internal back-slash, double-quote, $, or back-tick characters
    #  use printf to avoid echo interpretation behaviors such as escapes and line continuation
    # Mac bsd_sed does not support group-0, so pattern uses group-1
    # Solaris sed only processes lines with trailing newline, passing in an extra newline
    # subshell (heirloom and posix) will trim the added trailing newline
    esc_arg="`printf '%s\n' "$esc_arg" | sed -e 's@\([$\"\`\\]\)@\\\\\\1@g' `"

    if [ "$PROTECT_NL" = "true" ] ; then
      # remove the padding X added above, this syntax is POSIX compatible but not heirloom-sh
      esc_arg="${esc_arg%X}"
    fi
    quoted_arg="\"$esc_arg\""

    if $ant_exec_debug
    then
        # using printf to avoid echo line continuation and escape interpretation
        printf "arg       : %s\n" "$arg"
        printf "quoted_arg: %s\n" "$quoted_arg"
    fi
    ant_exec_args="$ant_exec_args $quoted_arg"
  fi
done

# Source/default ant configuration
if $no_config ; then
  rpm_mode=false
  usejikes=$use_jikes_default
else
  # load system-wide ant configuration (ONLY if ANT_HOME has NOT been set)
  if [ -z "$ANT_HOME" -o "$ANT_HOME" = "/usr/share/ant" ]; then
      if [ -f "/etc/ant.conf" ] ; then
          . /etc/ant.conf
      fi
  fi

  # load user ant configuration
  if [ -f "$HOME/.ant/ant.conf" ] ; then
    . $HOME/.ant/ant.conf
  fi
  if [ -f "$HOME/.antrc" ] ; then
    . "$HOME/.antrc"
  fi

  # provide default configuration values
  if [ -z "$rpm_mode" ] ; then
    rpm_mode=false
  fi
  if [ -z "$usejikes" ] ; then
    usejikes=$use_jikes_default
  fi
fi

# Setup Java environment in rpm mode
if $rpm_mode ; then
  if [ -f /usr/share/java-utils/java-functions ] ; then
    . /usr/share/java-utils/java-functions
    set_jvm
    set_javacmd
  fi
fi

# OS specific support.  $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false;
case "`uname`" in
  CYGWIN*) cygwin=true ;;
  Darwin*) darwin=true
           if [ -z "$JAVA_HOME" ] ; then
               if [ -x '/usr/libexec/java_home' ] ; then
                   JAVA_HOME=`/usr/libexec/java_home`
               elif [ -d "/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home" ]; then
                   JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home
               fi
           fi
           ;;
  MINGW*) mingw=true ;;
esac

if [ -z "$ANT_HOME" -o ! -d "$ANT_HOME" ] ; then
  ## resolve links - $0 may be a link to ant's home
  PRG="$0"
  progname=`basename "$0"`

  # need this for relative symlinks
  while [ -h "$PRG" ] ; do
    ls=`ls -ld "$PRG"`
    link=`expr "$ls" : '.*-> \(.*\)$'`
    if expr "$link" : '/.*' > /dev/null; then
    PRG="$link"
    else
    PRG=`dirname "$PRG"`"/$link"
    fi
  done

  ANT_HOME=`dirname "$PRG"`/..

  # make it fully qualified
  ANT_HOME=`cd "$ANT_HOME" > /dev/null && pwd`
fi

# For Cygwin and Mingw, ensure paths are in UNIX format before
# anything is touched
if $cygwin ; then
  [ -n "$ANT_HOME" ] &&
    ANT_HOME=`cygpath --unix "$ANT_HOME"`
  [ -n "$JAVA_HOME" ] &&
    JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
fi
if $mingw ; then
  [ -n "$ANT_HOME" ] &&
    ANT_HOME="`(cd "$ANT_HOME"; pwd)`"
  [ -n "$JAVA_HOME" ] &&
    JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
fi

# set ANT_LIB location
ANT_LIB="${ANT_HOME}/lib"

if [ -z "$JAVACMD" ] ; then
  if [ -n "$JAVA_HOME"  ] ; then
    # IBM's JDK on AIX uses strange locations for the executables
    if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
      JAVACMD="$JAVA_HOME/jre/sh/java"
    elif [ -x "$JAVA_HOME/jre/bin/java" ] ; then
      JAVACMD="$JAVA_HOME/jre/bin/java"
    else
      JAVACMD="$JAVA_HOME/bin/java"
    fi
  else
    JAVACMD=`which java 2> /dev/null `
    if [ -z "$JAVACMD" ] ; then
        JAVACMD=java
    fi
  fi
fi

if [ ! -x "$JAVACMD" ] ; then
  echo "Error: JAVA_HOME is not defined correctly."
  echo "  We cannot execute $JAVACMD"
  exit 1
fi

# Build local classpath using just the launcher in non-rpm mode or
# use the Jpackage helper in rpm mode with basic and default jars
# specified in the ant.conf configuration. Because the launcher is
# used, libraries linked in ANT_HOME/lib will also be included, but this
# is discouraged as it is not java-version safe. A user should
# request optional jars and their dependencies via the OPT_JAR_LIST
# variable
if $rpm_mode && [ -x /usr/bin/build-classpath ] ; then
  LOCALCLASSPATH="$(/usr/bin/build-classpath ant ant-launcher jaxp_parser_impl xml-commons-apis)"

  # If no optional jars have been specified then build the default list
  if [ -z "$OPT_JAR_LIST" ] ; then
    for file in /etc/ant.d/*; do
      if [ -f "$file" ]; then
        case "$file" in
        *~) ;;
        *#*) ;;
        *.rpmsave) ;;
        *.rpmnew) ;;
        *)
          for dep in `cat "$file"`; do
            OPT_JAR_LIST="$OPT_JAR_LIST${OPT_JAR_LIST:+ }$dep"
          done
        esac
      fi
    done
  fi

  # If the user requested to try to add some other jars to the classpath
  if [ -n "$OPT_JAR_LIST" ] ; then
    _OPTCLASSPATH="$(/usr/bin/build-classpath $OPT_JAR_LIST 2> /dev/null)"
    if [ -n "$_OPTCLASSPATH" ] ; then 
      LOCALCLASSPATH="$LOCALCLASSPATH:$_OPTCLASSPATH"
    fi
  fi

  # Explicitly add javac path to classpath, assume JAVA_HOME set
  # properly in rpm mode
  if [ -f "$JAVA_HOME/lib/tools.jar" ] ; then
    LOCALCLASSPATH="$LOCALCLASSPATH:$JAVA_HOME/lib/tools.jar"
  fi
  if [ -f "$JAVA_HOME/lib/classes.zip" ] ; then
    LOCALCLASSPATH="$LOCALCLASSPATH:$JAVA_HOME/lib/classes.zip"
  fi

  # if CLASSPATH_OVERRIDE env var is set, LOCALCLASSPATH will be
  # user CLASSPATH first and ant-found jars after.
  # In that case, the user CLASSPATH will override ant-found jars
  #
  # if CLASSPATH_OVERRIDE is not set, we'll have the normal behaviour
  # with ant-found jars first and user CLASSPATH after
  if [ -n "$CLASSPATH" ] ; then
    # merge local and specified classpath 
    if [ -z "$LOCALCLASSPATH" ] ; then 
      LOCALCLASSPATH="$CLASSPATH"
    elif [ -n "$CLASSPATH_OVERRIDE" ] ; then
      LOCALCLASSPATH="$CLASSPATH:$LOCALCLASSPATH"
    else
      LOCALCLASSPATH="$LOCALCLASSPATH:$CLASSPATH"
    fi

    # remove class path from launcher -cp option
    CLASSPATH=""
  fi
else
  # not using rpm_mode; use launcher to determine classpaths
  if [ -z "$LOCALCLASSPATH" ] ; then
      LOCALCLASSPATH=$ANT_LIB/ant-launcher.jar
  else
      LOCALCLASSPATH=$ANT_LIB/ant-launcher.jar:$LOCALCLASSPATH
  fi
fi

if [ -n "$JAVA_HOME" ] ; then
  # OSX hack to make Ant work with jikes
  if $darwin ; then
    OSXHACK="${JAVA_HOME}/../Classes"
    if [ -d "${OSXHACK}" ] ; then
      for i in "${OSXHACK}"/*.jar
      do
        JIKESPATH="$JIKESPATH:$i"
      done
    fi
  fi
fi

# Allow Jikes support (off by default)
if $usejikes; then
  ANT_OPTS="$ANT_OPTS -Dbuild.compiler=jikes"
fi

# For Cygwin, switch paths to appropriate format before running java
# For PATHs convert to unix format first, then to windows format to ensure
# both formats are supported. Probably this will fail on directories with ;
# in the name in the path. Let's assume that paths containing ; are more
# rare than windows style paths on cygwin.
if $cygwin; then
  if [ "$OS" = "Windows_NT" ] && cygpath -m .>/dev/null 2>/dev/null ; then
    format=mixed
  else
    format=windows
  fi
  [ -n "$ANT_HOME" ] && ANT_HOME=`cygpath --$format "$ANT_HOME"`
  ANT_LIB=`cygpath --$format "$ANT_LIB"`
  [ -n "$JAVA_HOME" ] && JAVA_HOME=`cygpath --$format "$JAVA_HOME"`
  LCP_TEMP=`cygpath --path --unix "$LOCALCLASSPATH"`
  LOCALCLASSPATH=`cygpath --path --$format "$LCP_TEMP"`
  if [ -n "$CLASSPATH" ] ; then
    CP_TEMP=`cygpath --path --unix "$CLASSPATH"`
    CLASSPATH=`cygpath --path --$format "$CP_TEMP"`
  fi
  CYGHOME=`cygpath --$format "$HOME"`
fi

# Show script help if requested
if $show_help ; then
  echo $0 '[script options] [options] [target [target2 [target3] ..]]'
  echo 'Script Options:'
  echo '  --help, --h            print this message and ant help'
  echo '  --noconfig             suppress sourcing of /etc/ant.conf,'
  echo '                         $HOME/.ant/ant.conf, and $HOME/.antrc'
  echo '                         configuration files'
  echo '  --usejikes             enable use of jikes by default, unless'
  echo '                         set explicitly in configuration files'
  echo '  --execdebug            print ant exec line generated by this'
  echo '                         launch script'
  echo '  '
fi
# add a second backslash to variables terminated by a backslash under cygwin
if $cygwin; then
  case "$ANT_HOME" in
    *\\ )
    ANT_HOME="$ANT_HOME\\"
    ;;
  esac
  case "$CYGHOME" in
    *\\ )
    CYGHOME="$CYGHOME\\"
    ;;
  esac
  case "$JIKESPATH" in
    *\\ )
    JIKESPATH="$JIKESPATH\\"
    ;;
  esac
  case "$LOCALCLASSPATH" in
    *\\ )
    LOCALCLASSPATH="$LOCALCLASSPATH\\"
    ;;
  esac
  case "$CLASSPATH" in
    *\\ )
    CLASSPATH="$CLASSPATH\\"
    ;;
  esac
fi
# Execute ant using eval/exec to preserve spaces in paths,
# java options, and ant args
ant_sys_opts=
if [ -n "$CYGHOME" ]; then
  if [ -n "$JIKESPATH" ]; then
    ant_sys_opts="-Djikes.class.path=\"$JIKESPATH\" -Dcygwin.user.home=\"$CYGHOME\""
  else
    ant_sys_opts="-Dcygwin.user.home=\"$CYGHOME\""
  fi
else
  if [ -n "$JIKESPATH" ]; then
    ant_sys_opts="-Djikes.class.path=\"$JIKESPATH\""
  fi
fi
ant_exec_command="exec \"\$JAVACMD\" $ANT_OPTS -classpath \"\$LOCALCLASSPATH\" -Dant.home=\"\$ANT_HOME\" -Dant.library.dir=\"\$ANT_LIB\" $ant_sys_opts org.apache.tools.ant.launch.Launcher $ANT_ARGS -cp \"\$CLASSPATH\""
if $ant_exec_debug ; then
    # using printf to avoid echo line continuation and escape interpretation confusion
    printf "%s\n" "$ant_exec_command $ant_exec_args"
fi

eval "$ant_exec_command $ant_exec_args"
sh-3.2# ./ant -version
Apache Ant(TM) version 1.10.1 compiled on February 2 2017

It seems like it worked, with a few minor differences. Ant is installed in a different location for me, and my total is different on the last ls -la, but Hadoop still fails to install.

I am on a mid-2010 MBP with a 960GB SSD, 8GB RAM running macOS Sierra 10.12.3. I didn't change anything really. After I upgraded to Sierra, I made sure I installed Apple Java 1.6 so that Creative Suite would work. I used the default installer settings, no modifications at all. Updated MacPorts as described in the guide, but I did not have the full Java 1.8 or 1.7 JDK installed then. Just the 1.8 JRE.

It feels like we are slowly moving towards identifying the problem.

Are there any further steps I should try?

comment:17 Changed 8 years ago by kencu (Ken)

Well, looks like you have ant working. Try opening a new terminal window and just type 'ant -version'.

If that works, try Hadoop again.
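
It can also help to confirm which JVM ant itself picks up. A couple of quick checks (assuming your ant supports -diagnostics, which 1.10.x does):

echo "$JAVA_HOME"
/usr/libexec/java_home
ant -diagnostics | grep 'java.version'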

comment:18 Changed 8 years ago by TheLastLovemark

Nope... still not working.

Here's the non-verbose/debug result:

$ ant -version
Apache Ant(TM) version 1.10.1 compiled on February 2 2017

$ sudo port install hadoop
Password:

--->  Computing dependencies for hadoop
--->  Fetching archive for hadoop
--->  Attempting to fetch hadoop-1.2.1_0+pseudo.darwin_16.x86_64.tbz2 from http://fco.it.packages.macports.org/mirrors/macports-packages/hadoop/hadoop
--->  Attempting to fetch hadoop-1.2.1_0+pseudo.darwin_16.x86_64.tbz2 from https://packages.macports.org/hadoop
--->  Attempting to fetch hadoop-1.2.1_0+pseudo.darwin_16.x86_64.tbz2 from http://jnb.za.packages.macports.org/packages/hadoop
--->  Fetching distfiles for hadoop
--->  Verifying checksums for hadoop
--->  Extracting hadoop
--->  Applying patches to hadoop
--->  Configuring hadoop
--->  Building hadoop
Error: Failed to build hadoop: command execution failed
Error: See /opt/local/var/macports/logs/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/main.log for details.
Error: Follow https://guide.macports.org/#project.tickets to report a bug.
Error: Processing of port hadoop failed

$ sudo port clean hadoop
--->  Cleaning hadoop

$

comment:19 Changed 8 years ago by kencu (Ken)

Now that you have ant working, the previous error you had (broken ant) should be gone. Please attach a new log so we can help you see what is happening now, if you like.

Changed 8 years ago by TheLastLovemark

Attachment: main.log added

Hadoop Logfile

comment:20 Changed 8 years ago by TheLastLovemark

Attached

comment:21 Changed 8 years ago by kencu (Ken)

Thanks. It doesn't build for me on Sierra either, although with a different error. Not sure at what system version it stopped working - the buildbots could probably tell us that. I am not certain what is wrong. This is the failing command:

:debug:build system:  cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1" && ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true

What I do in these situations is break it up and try it manually. So I would do this:

cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1"

and then (using sudo):

sudo ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true

That might give you better error messages about why that command is failing.
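
One caveat: MacPorts sets JAVA_HOME='/System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home' in its build environment (see comment:11), so to reproduce the port's failure exactly you would probably want the same setting, something like:

sudo env JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true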

Perhaps someone who knows more about busted Java apps can help out. Sorry, no magic fix.

comment:22 Changed 8 years ago by TheLastLovemark

I did this as root, so no need for sudo:

sh-3.2# cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1"

sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

sh-3.2# 

I tried your solution with ant using 1.6, 1.7 and 1.8. The latter gave some interesting results, but ultimately failed...

sh-3.2# setJdk6

sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

sh-3.2# setJdk7

sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)

sh-3.2# setJdk8

sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true

Buildfile: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml

compile-native:

create-native-configure:
     [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
     [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
     [exec] configure.ac:42: the top level
     [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
     [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
     [exec] configure.ac:42: the top level
     [exec] glibtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR, 'config'.
     [exec] glibtoolize: copying file 'config/ltmain.sh'
     [exec] glibtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to configure.ac,
     [exec] glibtoolize: and rerunning glibtoolize and aclocal.
     [exec] glibtoolize: Consider adding '-I m4' to ACLOCAL_AMFLAGS in Makefile.am.
     [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
     [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
     [exec] configure.ac:42: the top level
     [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
     [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
     [exec] configure.ac:42: the top level
     [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
     [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
     [exec] configure.ac:42: the top level
     [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
     [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
     [exec] configure.ac:42: the top level
     [exec] configure.ac:44: warning: AM_INIT_AUTOMAKE: two- and three-arguments forms are deprecated.  For more info, see:
     [exec] configure.ac:44: http://www.gnu.org/software/automake/manual/automake.html#Modernize-AM_005fINIT_005fAUTOMAKE-invocation
     [exec] configure.ac:41: installing 'config/compile'
     [exec] configure.ac:44: installing 'config/missing'
     [exec] Makefile.am:32: warning: shell echo $$OS_NAME | tr [A-Z] [a-z]: non-POSIX variable name
     [exec] Makefile.am:32: (probably a GNU make extension)
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] automake: warning: possible forward-incompatibility.
     [exec] automake: At least a source file is in a subdirectory, but the 'subdir-objects'
     [exec] automake: automake option hasn't been enabled.  For now, the corresponding output
     [exec] automake: object file(s) will be placed in the top-level directory.  However,
     [exec] automake: this behaviour will change in future Automake versions: they will
     [exec] automake: unconditionally cause object files to be placed in the same subdirectory
     [exec] automake: of the corresponding sources.
     [exec] automake: You are advised to start using 'subdir-objects' option throughout your
     [exec] automake: project, to avoid future incompatibilities.
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/security/getGroup.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/security/JniBasedUnixGroupsMapping.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/nativeio/file_descriptor.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/nativeio/errno_enum.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am:43: warning: source file 'src/org/apache/hadoop/io/nativeio/NativeIO.c' is in a subdirectory,
     [exec] Makefile.am:43: but option 'subdir-objects' is disabled
     [exec] Makefile.am: installing 'config/depcomp'

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-init-dirs:
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ivy
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ivy/lib
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ivy/report

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/ivy/ivysettings.xml

init:
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/tools
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/src
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/task/WEB-INF
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/job/WEB-INF
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/history/WEB-INF
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/hdfs/WEB-INF
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/datanode/WEB-INF
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/secondary/WEB-INF
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/examples
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ant
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/c++
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/classes
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/testjar
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/testshell
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/extraconf
    [touch] Creating /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/null1998253588
   [delete] Deleting: /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/null1998253588
     [copy] Copying 9 files to /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps
     [exec] svn: E155007: '/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1' is not a working copy
     [exec] svn: E155007: '/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1' is not a working copy
     [exec] src/saveVersion.sh: line 36: md5sum: command not found
     [exec] xargs: md5sum: No such file or directory

record-parser:

compile-rcc-compiler:
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:472: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 29 source files to /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
    [javac] warning: [options] bootstrap class path not set in conjunction with -source 1.6
    [javac] 1 warning

compile-core-classes:
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:496: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 446 source files to /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
    [javac] warning: [options] bootstrap class path not set in conjunction with -source 1.6
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:51: warning: ResolverConfiguration is internal proprietary API and may be removed in a future release
    [javac] import sun.net.dns.ResolverConfiguration;
    [javac]                   ^
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:52: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
    [javac] import sun.net.util.IPAddressUtil;
    [javac]                    ^
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/http/HttpServer.java:248: warning: [unchecked] unchecked call to put(K,V) as a member of the raw type Map
    [javac]         logContext.getInitParams().put(
    [javac]                                       ^
    [javac]   where K,V are type-variables:
    [javac]     K extends Object declared in interface Map
    [javac]     V extends Object declared in interface Map
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:493: warning: ResolverConfiguration is internal proprietary API and may be removed in a future release
    [javac]         ResolverConfiguration.open().searchlist();
    [javac]         ^
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:510: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
    [javac]       if (IPAddressUtil.isIPv4LiteralAddress(host)) {
    [javac]           ^
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:512: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
    [javac]         byte[] ip = IPAddressUtil.textToNumericFormatV4(host);
    [javac]                     ^
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:514: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
    [javac]       } else if (IPAddressUtil.isIPv6LiteralAddress(host)) {
    [javac]                  ^
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:516: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
    [javac]         byte[] ip = IPAddressUtil.textToNumericFormatV6(host);
    [javac]                     ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 9 warnings
    [javac] Creating empty /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes/org/apache/hadoop/jmx/package-info.class
     [copy] Copying 1 file to /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes

compile-core-native:
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/lib
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/snappy
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/nativeio
    [mkdir] Created dir: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/security
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor_CompressionHeader.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor_CompressionStrategy.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor_CompressionLevel.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor_CompressionHeader.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/snappy/org_apache_hadoop_io_compress_snappy_SnappyCompressor.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/snappy/org_apache_hadoop_io_compress_snappy_SnappyDecompressor.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/nativeio/org_apache_hadoop_io_nativeio_NativeIO.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/nativeio/org_apache_hadoop_io_nativeio_NativeIO_Stat.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/nativeio/org_apache_hadoop_io_nativeio_NativeIO_CachedUid.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsMapping.h]]
    [javah] [Forcefully writing file RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native/Mac_OS_X-x86_64-64/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsNetgroupMapping.h]]
     [exec] checking for gcc... /usr/bin/clang
     [exec] checking whether the C compiler works... yes
     [exec] checking for C compiler default output file name... a.out
     [exec] checking for suffix of executables... 
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether /usr/bin/clang accepts -g... yes
     [exec] checking for /usr/bin/clang option to accept ISO C89... none needed
     [exec] checking whether /usr/bin/clang understands -c and -o together... yes
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking how to run the C preprocessor... /usr/bin/clang -E
     [exec] checking for grep that handles long lines and -e... /usr/bin/grep
     [exec] checking for egrep... /usr/bin/grep -E
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking minix/config.h usability... no
     [exec] checking minix/config.h presence... no
     [exec] checking for minix/config.h... no
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for a BSD-compatible install... /opt/local/bin/ginstall -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for a thread-safe mkdir -p... /opt/local/bin/gmkdir -p
     [exec] checking for gawk... gawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking for style of include used by make... GNU
     [exec] checking whether make supports nested variables... yes
     [exec] checking dependency style of /usr/bin/clang... gcc3
     [exec] checking for gcc... (cached) /usr/bin/clang
     [exec] checking whether we are using the GNU C compiler... (cached) yes
     [exec] checking whether /usr/bin/clang accepts -g... (cached) yes
     [exec] checking for /usr/bin/clang option to accept ISO C89... (cached) none needed
     [exec] checking whether /usr/bin/clang understands -c and -o together... (cached) yes
     [exec] checking build system type... x86_64-apple-darwin16.4.0
     [exec] checking host system type... x86_64-apple-darwin16.4.0
     [exec] checking how to print strings... printf
     [exec] checking for a sed that does not truncate output... /opt/local/bin/gsed
     [exec] checking for fgrep... /usr/bin/grep -F
     [exec] checking for ld used by /usr/bin/clang... /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld
     [exec] checking if the linker (/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld) is GNU ld... no
     [exec] checking for BSD- or MS-compatible name lister (nm)... /opt/local/bin/nm -B
     [exec] checking the name lister (/opt/local/bin/nm -B) interface... BSD nm
     [exec] checking whether ln -s works... yes
     [exec] checking the maximum length of command line arguments... 196608
     [exec] checking how to convert x86_64-apple-darwin16.4.0 file names to x86_64-apple-darwin16.4.0 format... func_convert_file_noop
     [exec] checking how to convert x86_64-apple-darwin16.4.0 file names to toolchain format... func_convert_file_noop
     [exec] checking for /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld option to reload object files... -r
     [exec] checking for objdump... objdump
     [exec] checking how to recognize dependent libraries... pass_all
     [exec] checking for dlltool... no
     [exec] checking how to associate runtime and link libraries... printf %s\n
     [exec] checking for ar... ar
     [exec] checking for archiver @FILE support... no
     [exec] checking for strip... strip
     [exec] checking for ranlib... ranlib
     [exec] checking command to parse /opt/local/bin/nm -B output from /usr/bin/clang object... ok
     [exec] checking for sysroot... no
     [exec] checking for a working dd... /bin/dd
     [exec] checking how to truncate binary pipes... /bin/dd bs=4096 count=1
     [exec] checking for mt... no
     [exec] checking if : is a manifest tool... no
     [exec] checking for dsymutil... dsymutil
     [exec] checking for nmedit... nmedit
     [exec] checking for lipo... lipo
     [exec] checking for otool... otool
     [exec] checking for otool64... no
     [exec] checking for -single_module linker flag... yes
     [exec] checking for -exported_symbols_list linker flag... yes
     [exec] checking for -force_load linker flag... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking for objdir... .libs
     [exec] checking if /usr/bin/clang supports -fno-rtti -fno-exceptions... yes
     [exec] checking for /usr/bin/clang option to produce PIC... -fno-common -DPIC
     [exec] checking if /usr/bin/clang PIC flag -fno-common -DPIC works... yes
     [exec] checking if /usr/bin/clang static flag -static works... no
     [exec] checking if /usr/bin/clang supports -c -o file.o... yes
     [exec] checking if /usr/bin/clang supports -c -o file.o... (cached) yes
     [exec] checking whether the /usr/bin/clang linker (/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... darwin16.4.0 dyld
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] checking for dlopen in -ldl... yes
     [exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
     [exec] checking for ANSI C header files... (cached) yes
     [exec] checking stdio.h usability... yes
     [exec] checking stdio.h presence... yes
     [exec] checking for stdio.h... yes
     [exec] checking stddef.h usability... yes
     [exec] checking stddef.h presence... yes
     [exec] checking for stddef.h... yes
     [exec] checking jni.h usability... yes
     [exec] checking jni.h presence... yes
     [exec] checking for jni.h... yes
     [exec] checking zlib.h usability... yes
     [exec] checking zlib.h presence... yes
     [exec] checking for zlib.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for '-lz'... 
     [exec] checking zconf.h usability... yes
     [exec] checking zconf.h presence... yes
     [exec] checking for zconf.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for '-lz'... (cached) 
     [exec] checking snappy-c.h usability... yes
     [exec] checking snappy-c.h presence... yes
     [exec] checking for snappy-c.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for '-lsnappy'... libnotfound.so
     [exec] checking fcntl.h usability... yes
     [exec] checking fcntl.h presence... yes
     [exec] checking for fcntl.h... yes
     [exec] checking for stdlib.h... (cached) yes
     [exec] checking for string.h... (cached) yes
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for fcntl.h... (cached) yes
     [exec] checking for posix_fadvise... no
     [exec] checking for fcntl.h... (cached) yes
     [exec] checking for sync_file_range... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for memset... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... no
     [exec] checking that generated files are newer than configure... done
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating config.h
     [exec] config.status: executing depfiles commands
     [exec] config.status: executing libtool commands
     [exec] /Applications/Xcode.app/Contents/Developer/usr/bin/make  all-am
     [exec] /bin/sh ./libtool  --tag=CC   --mode=compile /usr/bin/clang -DHAVE_CONFIG_H -I. -I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native  -I/System/Library/Frameworks/JavaVM.framework/Headers -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin -I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/io/compress/snappy -Isrc/org/apache/hadoop/io/nativeio -Isrc/org/apache/hadoop/security -I/opt/local/include -I/System/Library/Frameworks/JavaVM.framework/Headers -I/System/Library/Frameworks/JavaVM.framework/Headers -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin -g -Wall -fPIC -O2 -m64 -Os -arch x86_64 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c -o ZlibCompressor.lo `test -f 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo '/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
     [exec] libtool: compile:  /usr/bin/clang -DHAVE_CONFIG_H -I. -I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native -I/System/Library/Frameworks/JavaVM.framework/Headers -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin -I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/io/compress/snappy -Isrc/org/apache/hadoop/io/nativeio -Isrc/org/apache/hadoop/security -I/opt/local/include -I/System/Library/Frameworks/JavaVM.framework/Headers -I/System/Library/Frameworks/JavaVM.framework/Headers -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin -g -Wall -fPIC -O2 -m64 -Os -arch x86_64 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c  -fno-common -DPIC -o .libs/ZlibCompressor.o
     [exec] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c:71:41: error: expected expression
     [exec]         void *libz = dlopen(HADOOP_ZLIB_LIBRARY, RTLD_LAZY | RTLD_GLOBAL);
     [exec]                                                ^
     [exec] 1 error generated.
     [exec] make[1]: *** [ZlibCompressor.lo] Error 1
     [exec] make: *** [all] Error 2

BUILD FAILED
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:627: The following error occurred while executing this line:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:707: exec returned: 2

Total time: 1 minute 27 seconds

sh-3.2# 

comment:23 Changed 8 years ago by kencu (Ken)

This might be worth looking at:

<http://stackoverflow.com/questions/10382929/how-to-fix-java-lang-unsupportedclassversionerror-unsupported-major-minor-versi>

Also - in that last attempt you've moved on to a more specific zlib error, one that might be possible to overcome. I think that's progress...
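
For what it's worth, the configure output above shows the "Checking for the 'actual' dynamic-library for '-lz'" test coming back empty (and '-lsnappy' resolving to libnotfound.so), so the HADOOP_ZLIB_LIBRARY macro presumably ends up defined as nothing, which would explain the "expected expression" error right at that dlopen() call. A hypothetical way to confirm, assuming configure left its config.h under the native build tree:

cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1"
grep HADOOP_ZLIB_LIBRARY build/native/Mac_OS_X-x86_64-64/config.h

An empty #define there would confirm the guess.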
