Multiple platform build/check report for BioC 3.19: simplified long
This page was generated on 2024-05-09 11:40:36 -0400 (Thu, 09 May 2024).
Hostname | OS | Arch (*) | R version | Installed pkgs
---|---|---|---|---
nebbiolo1 | Linux (Ubuntu 22.04.3 LTS) | x86_64 | 4.4.0 (2024-04-24) -- "Puppy Cup" | 4748
palomino3 | Windows Server 2022 Datacenter | x64 | 4.4.0 (2024-04-24 ucrt) -- "Puppy Cup" | 4484
lconway | macOS 12.7.1 Monterey | x86_64 | 4.4.0 (2024-04-24) -- "Puppy Cup" | 4514
kunpeng2 | Linux (openEuler 22.03 LTS-SP1) | aarch64 | 4.4.0 beta (2024-04-15 r86425) -- "Puppy Cup" | 4480
Click on any hostname to see more info about the system (e.g. compilers).
(*) as reported by 'uname -p', except on Windows and Mac OS X.
Package 182/2300: BiocHail 1.4.0 (landing page), maintainer: Vincent Carey

Hostname | OS / Arch | INSTALL | BUILD | CHECK | BUILD BIN
---|---|---|---|---|---
nebbiolo1 | Linux (Ubuntu 22.04.3 LTS) / x86_64 | OK | OK | OK |
palomino3 | Windows Server 2022 Datacenter / x64 | ... NOT SUPPORTED ... | | |
lconway | macOS 12.7.1 Monterey / x86_64 | ... NOT SUPPORTED ... | | |
kunpeng2 | Linux (openEuler 22.03 LTS-SP1) / aarch64 | OK | ERROR | skipped |
kjohnson3 | macOS 13.6.5 Ventura / arm64 | see weekly results here | | |
To the developers/maintainers of the BiocHail package:

- Allow up to 24 hours (and sometimes 48 hours) for your latest push to git@git.bioconductor.org:packages/BiocHail.git to be reflected on this report. See Troubleshooting Build Report for more information.
- Use the following Renviron settings to reproduce errors and warnings. See Renviron.bioc for more information.
- If 'R CMD check' recently started to fail on the Linux builder(s) over a missing dependency, add the missing dependency to 'Suggests:' in your DESCRIPTION file.
- See Martin Grigorov's blog post for how to debug Linux ARM64 related issues on an x86_64 host.
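The 'Suggests:' advice amounts to a one-field DESCRIPTION edit. A minimal sketch, assuming the vignette needs a package the builders no longer install by default (the package names listed here are illustrative, not BiocHail's actual suggests list):

```
Package: BiocHail
Suggests: knitr,
    rmarkdown,
    somePkgTheVignetteUses
```

Packages in 'Suggests:' are available during vignette building and 'R CMD check' without becoming hard installation dependencies.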
Package: BiocHail
Version: 1.4.0
Command: /home/biocbuild/R/R-beta-2024-04-15_r86425/bin/R CMD build --keep-empty-dirs --no-resave-data BiocHail
StartedAt: 2024-05-08 22:43:48 -0000 (Wed, 08 May 2024)
EndedAt: 2024-05-08 22:44:20 -0000 (Wed, 08 May 2024)
ElapsedTime: 32.0 seconds
RetCode: 1
Status: ERROR
PackageFile: None
PackageFileSize: NA
##############################################################################
##############################################################################
###
### Running command:
###
###   /home/biocbuild/R/R-beta-2024-04-15_r86425/bin/R CMD build --keep-empty-dirs --no-resave-data BiocHail
###
##############################################################################
##############################################################################

* checking for file ‘BiocHail/DESCRIPTION’ ... OK
* preparing ‘BiocHail’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... ERROR
--- re-building ‘gwas_tut.Rmd’ using rmarkdown
--- finished re-building ‘gwas_tut.Rmd’
--- re-building ‘large_t2t.Rmd’ using rmarkdown
2024-05-08 22:44:15.865 WARN NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2024-05-08 22:44:17.052 WARN Utils:69 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    [the same warning repeated for each of the 16 retry attempts, 22:44:17.056 through 22:44:17.150]
2024-05-08 22:44:17.160 ERROR SparkContext:94 - Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)!
Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at java.base/sun.nio.ch.Net.bind0(Native Method)
    at java.base/sun.nio.ch.Net.bind(Net.java:555)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.netBind(ServerSocketChannelImpl.java:337)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:294)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:562)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:833)
Initializing Hail with default parameters...
2024-05-08 22:44:17.230 WARN SparkContext:69 - Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243).
The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
is.hail.backend.spark.SparkBackend$.configureAndCreateSparkContext(SparkBackend.scala:148)
is.hail.backend.spark.SparkBackend$.apply(SparkBackend.scala:230)
is.hail.backend.spark.SparkBackend.apply(SparkBackend.scala)
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.base/java.lang.reflect.Method.invoke(Method.java:568)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:282)
py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
py4j.commands.CallCommand.execute(CallCommand.java:79)
py4j.GatewayConnection.run(GatewayConnection.java:238)
java.base/java.lang.Thread.run(Thread.java:833)
2024-05-08 22:44:17.265 WARN Utils:69 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    [the same warning repeated for each of the 16 retry attempts, 22:44:17.269 through 22:44:17.324]
2024-05-08 22:44:17.329 ERROR SparkContext:94 - Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)!
Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at java.base/sun.nio.ch.Net.bind0(Native Method)
    at java.base/sun.nio.ch.Net.bind(Net.java:555)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.netBind(ServerSocketChannelImpl.java:337)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:294)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:562)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:260)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:356)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:833)
Quitting from lines 69-79 [do17] (large_t2t.Rmd)
Error: processing vignette 'large_t2t.Rmd' failed with diagnostics:
py4j.protocol.Py4JJavaError: An error occurred while calling z:is.hail.backend.spark.SparkBackend.apply.
: java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)!
Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
<...truncated...>Executor.java:164)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:833)
Run `reticulate::py_last_error()` for details.
--- failed re-building ‘large_t2t.Rmd’
--- re-building ‘ukbb.Rmd’ using rmarkdown
--- finished re-building ‘ukbb.Rmd’
SUMMARY: processing the following file failed:
  ‘large_t2t.Rmd’
Error: Vignette re-building failed.
Execution halted
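The root failure in the log above is the OS refusing to bind the Spark driver's listening socket ("Cannot assign requested address", i.e. EADDRNOTAVAIL), which typically happens when the address Spark guesses for the host is not configured on any local interface. A minimal sketch of the same failure mode at the socket level (the addresses are illustrative; 203.0.113.9 is a reserved TEST-NET address that should not be local on any machine):

```python
import socket

def can_bind(addr: str) -> bool:
    """Try to bind a TCP listening socket on `addr` at an OS-chosen free port
    (port 0), which is what Spark's 'sparkDriver' service attempts."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((addr, 0))  # raises OSError (EADDRNOTAVAIL) for non-local addresses
        return True
    except OSError:
        return False
    finally:
        s.close()

print(can_bind("127.0.0.1"))    # loopback is assignable on any normal host
print(can_bind("203.0.113.9"))  # a non-local address fails, like the builder's driver
```

Per the log's own advice, pinning the driver to a known-bindable address sidesteps the guess: set spark.driver.bindAddress, for example by exporting SPARK_LOCAL_IP=127.0.0.1 in the build environment, or, if you control Hail initialization directly, by passing spark_conf={"spark.driver.bindAddress": "127.0.0.1"} to hail.init().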