
How to Collect Data for HCP for Cloud Scale

 

Objective

This article covers data collection for HCP for Cloud Scale. If you experience an issue, use the procedures in this article to obtain the data that GSC needs to analyze and resolve your problem.

Environment

  • HCP for Cloud Scale (HCP-CS)
    • Versions 2.6.x and earlier

Procedure

  1. Collect diagnostic information for HCP for Cloud Scale.

The log_download tool is located at the following path on each HCP-CS instance:

/<hcpcs-installation-path>/bin/log_download

For example: /opt/hcpcs/bin/log_download

To gather logs for the past 24 hours, execute:

/<hcpcs-installation-path>/bin/log_download -d -l
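
For example, using the default installation path shown above:

/opt/hcpcs/bin/log_download -d -l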

To gather logs for a specific time frame, use the -t option with start and end dates:

/<hcpcs-installation-path>/bin/log_download -d -l -t yyyy-MM-dd,yyyy-MM-dd
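
For example, to collect logs for the first week of January 2024 (sample dates), using the default installation path:

/opt/hcpcs/bin/log_download -d -l -t 2024-01-01,2024-01-07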

For more information about running the tool, run:

/<hcpcs-installation-path>/bin/log_download -h

When using the log_download script, if you specify the --output option, do not use an output path that contains colons, spaces, or symbolic links. If you omit the --output option, you cannot run the script from within a directory path that contains colons, spaces, or symbolic links.
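
As a sketch, assuming --output takes the destination path as its argument, a run that writes the logs to a path free of colons, spaces, and symbolic links might look like this (the destination directory /var/tmp/hcpcs_logs is only an illustration):

/opt/hcpcs/bin/log_download -d -l --output /var/tmp/hcpcs_logs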

When you run the log_download script, all log files are automatically compressed and moved to the retired/ directory.
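
If you need to locate those compressed files afterwards, a simple search from the installation path can help (a hedged example; the exact layout of the retired/ directories may vary by deployment):

find /opt/hcpcs -type d -name retired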

Additional Notes

If an instance is down, specify the --offline option to collect the logs from that instance. If your whole system is down, run the log_download script with the --offline option on each instance.
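
For example, on an instance that is down (assuming the default installation path and that --offline can be combined with the options shown above):

/opt/hcpcs/bin/log_download -d -l --offline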

 
