About the meeting

The ALICE-SHARK User Meeting 2022 (ASUM2022) took place on 18 May 2022 from 9:00 to 13:00 in a hybrid format. Online participation was facilitated through a dedicated Microsoft Teams team (“ALICE-SHARK User Meeting 2022”) to which remote and on-site attendees were added. The on-site part took place in room 1A01 of the Pieter-de-la-Court building. In addition, a mailing list was created to contact all participants without using Teams.

Registration was mandatory and done via e-mail. The total number of registered participants was 46 (26 from LEI, 19 from LUMC, 1 from a third-party organization), with 22 participants registering for remote attendance and 24 for in-person attendance. A total of 34 participants joined the live stream, including a few on-site participants. While we did not count the number of actual participants during the meeting, more participants joined online than anticipated, and we estimate the total number of participants to have been about 40.

About the survey

Following ASUM2022, we sent out a short survey to all participants for feedback. The survey was created with Microsoft Forms and conducted anonymously.

Summary of survey responses

A total of 12 participants filled out the form, which corresponds to a response rate of 30% (assuming 40 participants in total). The average completion time was 5 min 19 s. The survey participants are equally split between remote and in-person meeting attendees and almost equally split between LEI (7) and LUMC (5) users. We did not ask participants to specify whether they use ALICE or SHARK.

The overall response to the meeting was positive (3.92 out of 5), as was the feedback on the hybrid setup (4.08), the use of Microsoft Teams (3.91) and the live stream (4.67). The Q&A session was also well received (4.17). The majority of respondents recommend repeating this meeting on a yearly basis.

Survey participants particularly liked the practical information in the talks and the sharing of user experiences. In addition, the updates on the clusters, the Q&A session and the direct interaction with the support teams were appreciated.

However, there are also several aspects of the meeting that could be improved. Talks should focus more on practical/technical content instead of scientific content, with more practical examples or presentations of tools. In addition, the scheduling, time management and hybrid setup of the meeting should be improved. More details on future developments for the clusters would also be welcome.

More general suggestions include creating a user platform for better interaction between users and offering workshops on using the clusters more effectively.

Conclusions from the survey

Bringing the user communities of ALICE and SHARK together in this manner was well received. We will continue this joint user meeting on a yearly basis with a hybrid setup. Whether the next meeting will be recorded will be decided by the organizing committee at that time.

It is important that the talks reflect the diversity of the work that is being done on the clusters. However, a stronger focus on highlighting practical/technical aspects of using the clusters is advisable. The support team can also present information and plans for the clusters in more detail.

The Q&A session will be kept for future meetings.

While the length of the meeting was sufficient, the schedule was too dense for the amount of time available. Fewer talks are advisable.

Survey Responses

The following is a summary of the responses to each question of the survey.

Please specify your organization

Organization | Count | Percentage
LEI | 7 | 58%
LUMC | 5 | 42%

Please specify your institute (Leiden University) or department (LUMC)

Organization | Institute/Department | Count
LEI | IBL | 2
LEI | MI | 2
LEI | Psychology | 2
LEI | STRW | 1
LUMC | Department of Biomedical Data Science, SASC | 1
LUMC | Human Genetics | 1
LUMC | LKEB | 2
LUMC | Radiotherapy | 1

Did you attend the meeting?

Options | Count | Percentage
Yes | 9 | 75%
Partially | 3 | 25%
No, I did not attend at all | 0 | 0%

Did you attend the meeting in-person or online?

The number of responses was evenly split between online and in-person attendees.

Options | Count | Percentage
In-person | 6 | 50%
Online | 6 | 50%

Rating questions

The following rating questions are on a scale of 0 to 5, with 5 being the best value.

Question | #Responses | Average Rating
How would you rate the meeting overall? | 12 | 3.92
How would you rate the hybrid setup of the meeting? | 12 | 4.08
How would you rate the quality of the live stream?* | 6 | 4.67

*The question on the quality of the live stream was only available to those who attended online (question 7).

Questions on the usefulness of various aspects of the meeting

The options for the following questions were text-based, ranging from “Extremely not useful” to “Extremely useful”. Here, the options were converted into a rating on a scale of 1 (“Extremely not useful”) to 5 (“Extremely useful”).

Question | #Responses | Average Rating
How useful did you find the Teams space created for the event? | 12 | 3.91
How useful did you find the chat functionality in Teams?* | 10 | 4.50
How useful did you find the Q&A session of the meeting? | 12 | 4.17

*This question had an additional option of “Not applicable” for those who did not make use of the chat functionality. This option was selected by 2 participants.

Would it be useful to record the meeting in the future?

Interestingly, the responses do not give a clear indication on whether to record the meeting, though online attendees were more in favor of recording it.

Options | Count | Percentage
Yes | 6 | 50%
No | 2 | 17%
Maybe | 4 | 33%

Online attendees answered the following: 4 Yes, 1 No, 1 Maybe
In-person attendees answered the following: 2 Yes, 1 No, 3 Maybe

How often should we host a user meeting like this?

Options | Count | Percentage
Once per year | 8 | 67%
Every 6 months | 3 | 25%
I don’t know | 1 | 8%
Other | 0 | 0%

What did you like most about the meeting?

Total number of responses: 10

Instead of listing each response individually, the following lists individual items. The number in brackets corresponds to how often each item was mentioned in the responses:

  • Practical examples/tips and tricks on using the clusters (6)

  • Sharing user experience (5)

  • Getting updates on the clusters (2)

  • Q&A Session (2)

  • Information about future plans for the clusters (1)

  • Variety of talks, with some focusing on practical aspects and others on what is done on the clusters (1)

  • Meeting support team in-person (1)

  • Meeting other users (1)

  • Coffee, Lemonade, Oreos (1)

What did you like the least about the meeting or what was missing?

Total number of responses: 8

Instead of listing each response individually, the following list summarizes the responses, which is why the survey responses are not quoted directly:

  • On-site attendees could not see online attendees

  • A common community platform to exchange with other users

  • More attention to the setup of the hybrid meeting, e.g., questions in the room were not repeated close to the microphone, side talks should not be held close to the microphone, and perhaps the camera’s microphone could be muted at times

  • The meeting could have been shorter

  • Talks outside of my own field were interesting but not useful

  • Some talks focused too much on science/research

  • A bit more explanation about future plans

  • Nothing

If you have any recommendations for future meetings, please provide them here. This can concern for example the scope of the program, suggestions for talks, the balance of scientific content versus practical/technical usage talks, duration of talks/meeting, location, logistics, discussions, meeting announcements, etc. You can also provide here additional feedback on your answers to questions in this survey.

Total number of responses: 8

Instead of listing each response individually, the following list summarizes the responses, which is why the survey responses are not quoted directly. The number in brackets corresponds to how often the item was mentioned in the responses:

  • Common user platform for users of ALICE and SHARK (2)

  • Or a regular in-person user group meeting

  • Duration of talks and location were fine

  • Talks should be more technically focused (4)

  • More practical information or tool introductions (6)

  • Tips and tricks from the users were great

  • Workshops on using ALICE better
