
Comments (22)

danielcweeks commented on July 20, 2024

The default ports are based on EMR and do not match the stock Hadoop defaults.

Just change the ports in those scripts to whatever you're running your RM on.
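As a sketch of the idea (the names below are illustrative, not inviso's actual code), holding the ResourceManager port in one configurable place makes that swap a one-line change:

```python
# Sketch only: build the YARN ResourceManager REST URL from a
# configurable port instead of a hard-coded EMR value (9026).
# Stock Hadoop / HDP ResourceManagers typically serve REST on 8088.
RM_PORT = 8088  # set to whatever your RM is actually listening on

def rm_apps_url(host, port=RM_PORT):
    """URL of the YARN REST endpoint listing running applications."""
    return 'http://%s:%s/ws/v1/cluster/apps?state=RUNNING' % (host, port)

print(rm_apps_url('resourcemanager.example.com'))
```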

from inviso.

taoran34 commented on July 20, 2024

Thanks a lot! I changed the ports and it works.
However, it does not work so well when I search for job information.

When I search for a user's jobs, the "Stop", "Duration", "Links", "Workflow ID", and "Genie Name" fields do not show up, and "Job Status" shows "UNKNOWN".
(screenshot)

Does this have any connection with my Hadoop configuration?
Which file is responsible for gathering job information?


danielcweeks commented on July 20, 2024

So, that means the job history isn't loading properly and only the job config information is available. You can check the logs from the Tomcat server (catalina.out and inviso.log) to see if there are exceptions.
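One quick way to surface such exceptions is to scan the log for stack-trace headers; a minimal sketch (the inlined sample stands in for your real catalina.out):

```python
# Sketch: pull exception lines out of a catalina.out-style log.
# The sample text is inlined for illustration; point the same scan
# at your real $CATALINA_HOME/logs/catalina.out instead.
sample_log = """INFO: Server startup in 4023 ms
org.apache.hadoop.security.AccessControlException: Permission denied
INFO: request handled"""

def exception_lines(text):
    """Return lines that look like Java exception headers."""
    return [line for line in text.splitlines() if 'Exception' in line]

for line in exception_lines(sample_log):
    print(line)
```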


taoran34 commented on July 20, 2024

Thanks a lot!
Also, does it matter whether the commands in the QuickStart are run as "root" or as "hadoop"?


taoran34 commented on July 20, 2024

I have checked catalina.out, and there is an exception as follows:

SEVERE: The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=EXECUTE, inode="/tmp":hadoop:supergroup:drwx------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5497)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5479)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:5441)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1707)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1659)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1639)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1613)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:497)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:322)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1144)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1132)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1122)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:264)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:231)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:224)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1295)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:300)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:296)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:296)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at com.netflix.bdp.inviso.history.TraceService.trace(TraceService.java:133)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at com.google.inject.servlet.DefaultFilterPipeline.dispatch(DefaultFilterPipeline.java:43)
    at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:113)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:503)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:421)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1070)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:314)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)


danielcweeks commented on July 20, 2024

It looks like you're running Tomcat as root instead of as hadoop:

user=root, access=EXECUTE, inode="/tmp":hadoop:supergroup:drwx------

You can either run Tomcat as the hadoop user or update your tomcat/bin/setenv.sh to export the Hadoop user:

export HADOOP_USER_NAME=hadoop

This will give the REST service access to read the files from HDFS.


taoran34 commented on July 20, 2024

Thanks a lot!


Slouis61 commented on July 20, 2024

In step 6, when I try to complete the process with:

python jes.py

I get:
ERROR:inviso.jes:[Errno 111] Connection refused
Traceback (most recent call last):
  File "jes.py", line 35, in main
    monitor.run()
  File "/inviso/jes/inviso/monitor.py", line 295, in run
    for f in listing:
  File "/venv/lib/python2.6/site-packages/snakebite/client.py", line 139, in ls
    recurse=recurse):
  File "/venv/lib/python2.6/site-packages/snakebite/client.py", line 1072, in _find_items
    fileinfo = self._get_file_info(path)
  File "/venv/lib/python2.6/site-packages/snakebite/client.py", line 1202, in _get_file_info
    return self.service.getFileInfo(request)
  File "/venv/lib/python2.6/site-packages/snakebite/service.py", line 35, in <lambda>
    rpc = lambda request, service=self, method=method.name: service.call(service_stub_class.__dict__[method], request)
  File "/venv/lib/python2.6/site-packages/snakebite/service.py", line 41, in call
    return method(self.service, controller, request)
  File "/venv/lib/python2.6/site-packages/google/protobuf/service_reflection.py", line 267, in <lambda>
    self._StubMethod(inst, method, rpc_controller, request, callback))
  File "/venv/lib/python2.6/site-packages/google/protobuf/service_reflection.py", line 284, in _StubMethod
    method_descriptor.output_type._concrete_class, callback)
  File "/venv/lib/python2.6/site-packages/snakebite/channel.py", line 409, in CallMethod
    self.get_connection(self.host, self.port)
  File "/venv/lib/python2.6/site-packages/snakebite/channel.py", line 211, in get_connection
    self.sock.connect((host, port))
  File "<string>", line 1, in connect
error: [Errno 111] Connection refused
(venv)

Any help would be greatly appreciated. We use Ambari (running on port 8080) and Hortonworks, if that makes any difference; I would think this would affect the default port used by inviso?


danielcweeks commented on July 20, 2024

It looks like the same port issue most people are having. The default port used by inviso is the one used by EMR. Take a look at the ports used by your RM and update the python scripts accordingly (these probably need to be pulled out into the settings file).
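For illustration, pulling the ports into a settings-style structure might look like this (the field names are hypothetical, not inviso's actual settings.py schema):

```python
# Hypothetical per-cluster settings so no script hard-codes EMR's ports.
CLUSTERS = [
    {
        'id': 'cluster-1',
        'name': 'production',
        'host': 'rm.example.com',
        'namenode_port': 8020,  # HDFS NameNode RPC (what snakebite dials)
        'rm_port': 8088,        # YARN ResourceManager REST (EMR used 9026)
    },
]

for cluster in CLUSTERS:
    print('%(host)s nn=%(namenode_port)d rm=%(rm_port)d' % cluster)
```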


Slouis61 commented on July 20, 2024

Daniel, thanks so much for the prompt reply! What is interesting is that even after I reinstalled the entire package and moved it to the resource manager node locally, I still get the same problem. I also moved the port to 10004 instead of 8080, changed 9026 to what Hortonworks uses (8088), and added the port=8020 as you specify above.
Any suggestions would be appreciated.
Still, I get:

(venv)[root@bcwnode10 jes]# python jes.py
ERROR:inviso.jes:[Errno 111] Connection refused
Traceback (most recent call last):
File "jes.py", line 36, in main
monitor.run()
File "/opt/inviso/inviso/jes/inviso/monitor.py", line 295, in run
for f in listing:
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/client.py", line 139, in ls
recurse=recurse):
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/client.py", line 1072, in _find_items
fileinfo = self._get_file_info(path)
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/client.py", line 1202, in _get_file_info
return self.service.getFileInfo(request)
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/service.py", line 35, in <lambda>
rpc = lambda request, service=self, method=method.name: service.call(service_stub_class.__dict__[method], request)
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/service.py", line 41, in call
return method(self.service, controller, request)
File "/opt/inviso/venv/lib/python2.6/site-packages/google/protobuf/service_reflection.py", line 267, in <lambda>
self._StubMethod(inst, method, rpc_controller, request, callback))
File "/opt/inviso/venv/lib/python2.6/site-packages/google/protobuf/service_reflection.py", line 284, in _StubMethod
method_descriptor.output_type._concrete_class, callback)
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/channel.py", line 409, in CallMethod
self.get_connection(self.host, self.port)
File "/opt/inviso/venv/lib/python2.6/site-packages/snakebite/channel.py", line 211, in get_connection
self.sock.connect((host, port))
File "<string>", line 1, in connect
error: [Errno 111] Connection refused
(venv)[root@bcwnode10 jes]#


danielcweeks commented on July 20, 2024

Hmm, can you make sure the NN is listening on localhost?

Try a 'telnet localhost 8020'. Otherwise, specify the host or IP in settings.py.

From the trace, I can't tell whether it can't talk to the NameNode or the DataNode.
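The telnet check can also be scripted; a small sketch (the host and port are assumptions, substitute your NameNode's values from settings.py):

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if something accepts a TCP connection on host:port,
    i.e. the same check as 'telnet host port'."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(can_connect('localhost', 8020))
```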

Also take a look at this fork (probably not current with master), which might have the same settings:
https://github.com/mikebin/inviso/commit/ae822d4eb8b4b87574b412df7c15ee9e71619c5f

Let me know if any of that helps.

On Thu, Feb 5, 2015 at 8:52 AM, Slouis61 wrote:

> Daniel, Thanks so much for the prompt reply! [...]


Slouis61 commented on July 20, 2024

Unfortunately, no, it did not work. I will keep trying. We love the promise of inviso and are hoping we can get it working with HDP 2.2. Thanks for your help; if you have any other ideas, let me know.


Slouis61 commented on July 20, 2024

And yes, I can telnet to the host IP in settings.py, using an FQDN:
[root@bcwnode10 jes]# telnet bcwnode1.buzzanglepro.com 8020
Trying 10.0.5.130...
Connected to bcwnode1.buzzanglepro.com.


danielcweeks commented on July 20, 2024

@Slouis61 I just pulled down the HDP 2.2 Sandbox and was able to get it all running with the following mods:

jes.py

--- a/jes/jes.py
+++ b/jes/jes.py
@@ -27,6 +27,7 @@ def main():
                                           cluster_id=cluster.id,
                                           cluster_name=cluster.name,
                                           host=cluster.host,
+                                          port=8020,
                                           publisher=publisher,
                                           elasticsearch=settings.elasticsearch))

--- a/jes/inviso/monitor.py
+++ b/jes/inviso/monitor.py
@@ -277,7 +277,7 @@ class HdfsMr2LogMonitor(ElasticSearchMonitor):
                  cluster_name,
                  host='localhost',
                  port=9000,
-                 log_path='/tmp/hadoop-yarn/staging/history/done', **kwargs):
+                 log_path='/mr-history/done', **kwargs):
         super(HdfsMr2LogMonitor, self).__init__(**kwargs)


--- a/jes/index_cluster_stats.py
+++ b/jes/index_cluster_stats.py
@@ -14,7 +14,7 @@ log = get_logger('inviso.cluster')
 es_index = 'inviso-cluster'

 def index_apps(es, cluster, info):
-    apps = requests.get('http://%s:%s/ws/v1/cluster/apps?state=RUNNING' % (cluster.host, '9026'), headers = {'ACCEPT':'application/json'}).jso
+    apps = requests.get('http://%s:%s/ws/v1/cluster/apps?state=RUNNING' % (cluster.host, '8088'), headers = {'ACCEPT':'application/json'}).jso

     if not apps:
         log.info(cluster.name + ': no applications running.')
@@ -51,7 +51,7 @@ def index_apps(es, cluster, info):
     log.debug(bulk(es, documents, stats_only=True));

 def index_metrics(es, cluster, info):
-    metrics = requests.get('http://%s:%s/ws/v1/cluster/metrics' % (cluster.host, '9026'), headers = {'ACCEPT':'application/json'}).json()['clu
+    metrics = requests.get('http://%s:%s/ws/v1/cluster/metrics' % (cluster.host, '8088'), headers = {'ACCEPT':'application/json'}).json()['clu
     metrics.update(info)

     r = es.index(index=es_index,
@@ -63,7 +63,7 @@ def index_metrics(es, cluster, info):
     log.debug(r)

 def index_scheduler(es, cluster, info):
-    scheduler = requests.get('http://%s:%s/ws/v1/cluster/scheduler' % (cluster.host, '9026'), headers = {'ACCEPT':'application/json'}).json()[
+    scheduler = requests.get('http://%s:%s/ws/v1/cluster/scheduler' % (cluster.host, '8088'), headers = {'ACCEPT':'application/json'}).json()[
     scheduler.update(info)


Slouis61 commented on July 20, 2024

It seems that HDP 2.2 sets a different user for hadoop. Once the proper user was set, and the proper FQDNs were used with the proper ports, everything went through without a hitch.

The only issue I am having now is with the public UI. It's a fresh install of Apache Tomcat 7, version 7.0.57, and my Java version is 1.7.0_45. I get 2 screens:

(screenshot)

and on refresh I get:

(screenshot)

It looks like a servlet issue?

I will add some comments to assist others with getting it running, and with what needs to be tweaked on non-sandbox implementations, as soon as it is all operational.

Again, THANKS!!!


dbenitez commented on July 20, 2024

Does this application work with jobs run via Tez, or just MapReduce jobs?


danielcweeks commented on July 20, 2024

It currently only supports MR jobs, but it could be extended to support just about anything that has sufficient detail for job history. The Timeline Server can probably provide this info in Hadoop 2.6, but I haven't looked into it.


phodamentals commented on July 20, 2024

@Slouis61 What all did you modify to get this working properly? I have attempted to change the ports in the corresponding files to match the MRv1 service; I tried 8021 and 50030 to no avail. I still get a "[Errno 32] Broken pipe" with my FQDN as well.

I am running Hadoop 2.0.0-cdh4.5.0.


phodamentals commented on July 20, 2024

:)


sammhho commented on July 20, 2024

Hi, not sure if I should open a new issue for this. I've followed the QuickStart, changed ports as @danielcweeks showed above, applied patch spotify/snakebite@68db2f2 as suggested in #11, and now running the python scripts (in a loop) gives no errors.
I can see jobs in the search page, and the 2 streams in the cluster page are displaying (not sure about correctness).
(screenshot)

However, in the Profiler page, the diagrams for Task Details and Execution Locality are not showing, and it looks like this:
(screenshot)
I tried "word count" and "TeraGen" and they all look the same. What is wrong and how should I debug?
I've looked at the Tomcat logs and found these:

May 10, 2016 7:12:08 PM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive /mnt/home/mhho/wrk/inviso/apache-tomcat-7.0.55/webapps/inviso#mr2#v0.war
May 10, 2016 7:12:09 PM org.apache.catalina.loader.WebappClassLoader validateJarFile
INFO: validateJarFile(/mnt/home/mhho/wrk/inviso/apache-tomcat-7.0.55/webapps/inviso#mr2#v0/WEB-INF/lib/jsp-api-2.1.jar) - jar not loaded. See Servlet Spec 3.0, section 10.7.2. Offending class: javax/el/Expression.class
May 10, 2016 7:12:09 PM org.apache.catalina.loader.WebappClassLoader validateJarFile
INFO: validateJarFile(/mnt/home/mhho/wrk/inviso/apache-tomcat-7.0.55/webapps/inviso#mr2#v0/WEB-INF/lib/servlet-api-2.5.jar) - jar not loaded. See Servlet Spec 3.0, section 10.7.2. Offending class: javax/servlet/Servlet.class
log4j:WARN No appenders could be found for logger (com.netflix.bdp.inviso.history.TraceService).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Are these relevant?


danielcweeks commented on July 20, 2024

@sammhho I don't think this is a problem with tomcat or with the scripts. The fact that you can see the application bar (blue) means that the data is there. Also, the task view below has the correct scale, so it's getting the data, but the task bars don't appear to be rendering.

It might be a browser issue. Could you try Chrome?


sammhho commented on July 20, 2024

@danielcweeks Thanks for the prompt reply; I tried Chrome and it does work.

