Notes from the User Group Meeting on Jan 29, 2015

The theme of this meeting was to hear feedback from the community and collect bug reports on the recently released HTRC v3.0 Beta. The portal can be accessed at https://htrc2.pti.indiana.edu/. The questions and issues discussed mainly concerned the HTRC Data Capsule, which was integrated into the portal for the first time. The main questions and suggestions are below.

  • About Data Capsule: when releasing results from the VM in secure mode, is there a file size limit?
Answer: There is currently no size limit, but a large release may raise a red flag for a human reviewer, although we do not have a human reviewer in place yet.
  • About Data Capsule: is there an expiration time for the released files?
Answer: The link for downloading results, which is sent to the user by email, is valid for 12 hours after the email is sent. After that, the results can no longer be downloaded.
  • About Data Capsule: Matthew Wilkens reported that he could not download the released file from the link sent to his email; the link led to an error page.
  • About Data Capsule: users cannot SSH into the VM. Matthew Wilkens expressed interest in using SSH.
Answer: The SSH port was closed deliberately for security reasons. It could be opened in maintenance mode, but that might confuse users.
  • About Data Capsule: Matthew Wilkens tried to create a VM with 8 VCPUs but failed. When he changed it to 1 VCPU, the creation succeeded.
Answer: There is a maximum of 10 VCPUs per user. His quota was likely used up, which caused the creation to fail.
  • About Data Capsule: it seems the Data API credential in the Data Capsule was not updated; the vsm/demo.sh script returned an error.
Update: The Data API credential was updated after the meeting.
  • It looks like people do not want a more recent version of Ubuntu for the VM; they would rather keep the current 12.04 release.

  • Suggestion: it would be nice to use a Work Set directly in the Data Capsule, instead of working around it by downloading the volume IDs from the portal and then using the Data API for content (a sketch of this workaround follows).
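
For reference, the current workaround could look roughly like the Python sketch below. The endpoint URL, the volumeIDs parameter, the pipe-separated ID format, and the token handling are assumptions about the Data API's general shape, not a confirmed interface, and the volume IDs shown are hypothetical.

    # Sketch of the current workaround: fetch Work Set content by volume ID
    # through the Data API. The endpoint URL, "volumeIDs" parameter, and
    # bearer-token header are assumptions, not a confirmed interface.
    import requests

    DATA_API_URL = "https://silvermaple.pti.indiana.edu:25443/data-api/volumes"  # assumed endpoint
    TOKEN = "REPLACE_WITH_OAUTH2_TOKEN"  # obtained separately from the portal

    def download_volumes(volume_ids, out_path="volumes.zip"):
        # POST a pipe-separated list of volume IDs; the service is assumed to
        # return a zip of the requested volumes, which is written to disk.
        resp = requests.post(
            DATA_API_URL,
            data={"volumeIDs": "|".join(volume_ids)},
            headers={"Authorization": "Bearer " + TOKEN},
        )
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            f.write(resp.content)

    # Hypothetical volume IDs exported from a portal Work Set:
    download_volumes(["mdp.39015012345678", "uc2.ark:/13960/t0example00"])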