BigGreen was made possible in part by a National Science Foundation grant that funds “Cyberinfrastructure for Transformational Scientific Discovery in West Virginia and Arkansas (CI-TRAIN),” a partnership among eight higher education institutions in West Virginia and Arkansas.
In 2015, the Campus Bridging team at the Extreme Science and Engineering Discovery Environment (XSEDE) assisted in rebuilding the cluster using the XSEDE-Compatible Basic Cluster (XCBC) toolkit.
The CI-TRAIN project is a partnership of higher education institutions in Arkansas and West Virginia formed to transform the practice of information technology services in support of scientific discovery. The partnership builds on common research in nanoscience and the geosciences, leverages complementary expertise, and has built advanced cyberinfrastructure that is advancing science in several domains.
A detailed schedule of CI-TRAIN seminars and more information are available at CI-Train.org.
Information Technology staff assist and support faculty, students, and researchers in many areas of research computing, including:
- Grant proposal preparation
- Data management plans
- Hardware and software acquisition specifications
- Vendor communication and pricing
- Hardware setup and software installation
- Server and storage hosting
- Virtual machine hosting
- Performance tuning and monitoring
- Linux and Windows systems administration and configuration
Marshall University has joined Internet2®, the advanced networking consortium. Through its membership, the university’s students, faculty and staff will have access to Internet2’s premier, ultrafast nationwide network which connects research and educational institutions in the U.S. and interconnects with international research networks worldwide.
Marshall’s connection to the Internet2 Network is made possible through a partnership with OARnet, Ohio’s statewide research and education network, and Merit, Michigan’s statewide research and education network and an Internet2 Connector organization.
Internet2 is the foremost U.S. advanced networking consortium. Led by the research and education community since 1996, Internet2 promotes the missions of its members by providing both leading-edge network capabilities and unique partnership opportunities that together facilitate the development, deployment and use of revolutionary Internet technologies. By bringing research and academia together with technology leaders from industry, government and the international community, Internet2 promotes collaboration and innovation that has a fundamental impact on the future of the Internet.
For more information about Internet2, visit internet2.edu.
The Huntington Campus network is linked by a university-owned metro fiber point-to-point service to the Robert C. Byrd Center for Flexible Manufacturing, the MU Visual Arts Center, and the Department of Dietetics facilities located in downtown Huntington.
An FCC grant-funded project completed in 2011 extended the Huntington metro fiber network to St. Mary’s Medical Center, the St. Mary’s Medical Center Education Center, Cabell Huntington Hospital, the Marshall University Medical Center, the JCESOM Fairfield campus, and the Marshall University Robert C. Byrd Biotechnology Science Center, joining them to MUnet over two redundant 10 Gb Ethernet rings. These metro Ethernet rings provide the bandwidth and redundancy needed to enable next-generation medical and collaboration technologies.
The MUnet campus networks are connected via dual, diversely routed 10 Gb connections through our Internet service provider (ISP), the Ohio Academic and Research Network (OARnet). Marshall University is also a member of Internet2 and is connected to Internet2 with 7 Gb of service. A total of 3 Gb of commodity Internet service is currently provided to MUnet subscribers. This bandwidth and redundancy provide the reliability and services needed to support current campus initiatives.
All MUnet services provide full Quality of Service (QoS) on all network ports, along with multicasting, in support of voice, data, video, and other real-time applications. All services are switched and operate at full wire speed.
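As a rough illustration of how an application can opt into this kind of QoS handling, the Python sketch below marks outgoing UDP datagrams with the Expedited Forwarding (EF) DSCP value conventionally used for voice. The destination address is a placeholder from the documentation range, and the socket option shown is the Linux behavior; other platforms may differ.

```python
import socket

# DSCP Expedited Forwarding (EF, decimal 46) is the marking conventionally
# used for voice traffic. The legacy TOS byte carries DSCP in its upper six
# bits, so the value is shifted left by two before being set on the socket.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2  # 0xB8

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# Every datagram sent from this socket now carries the EF marking, which
# QoS-enabled switch ports can classify and prioritize end to end.
# 192.0.2.10 is a placeholder address from the TEST-NET-1 range.
sock.sendto(b"rtp-payload-placeholder", ("192.0.2.10", 5004))
sock.close()
```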
MUnet supports full Voice over IP (VoIP) telephony services, with unified communications and voice mail, for over 4,000 extensions, plus a limited number of fax and other analog lines via analog gateways.
MUnet central video conferencing services support full high-definition (HD) conferencing at 720p or 1080p. All HD endpoints are capable of four-way video calls; calls with more than four concurrent endpoints are handled by a 20-port Multipoint Control Unit (MCU) supporting full HD.
Web conferencing for virtual classrooms is provided by Microsoft Teams. The service delivers a full virtual classroom experience with student breakout rooms, lecture recording and archiving, and polling, questions, and quizzes during on-demand archived sessions. Rooms are also available for campus meetings and other events.
Marshall University Information Technology, in collaboration with researchers and scientists, is building an optimized network for scientific research and data transfer, funded by an NSF CC*DNI award.
The objectives of this project are:
- Improve research data flows and remove any data flow constraints.
- Aggregate network traffic from other network domains.
- Improve high-speed access to online scientific applications and data generated at MU, other institutions, and federal agencies.
- Deploy Cisco 10G switches at the endpoint buildings to provide 10G access to researchers’ workstations.
- Deploy Cisco switches with 40G capabilities to provide the 40G backbone and support the 10G endpoint switches.
- Provision a dedicated and independent 1GE uplink to Internet2 and OARnet.
- Implement a Science DMZ within the campus network infrastructure to allow trusted data to travel outside the firewall.
- Find bottlenecks and optimize the network for high-volume transfer of datasets.
- Utilize perfSONAR for performance measurement (a sketch of scheduling a test follows this list).
- Implement switches supporting software-defined networking (SDN).
- Support OpenFlow-based architectures, both experimentally and in operations.
- Install a Data Transfer Node (DTN).
- Improve identity management and authentication.
- Fully implement InCommon over the next twelve to eighteen months.
- Implement CILogon.
- Implement Eduroam.
- Provide training for key IT staff on new CI tools (perfSONAR, InCommon, CILogon, Eduroam, etc.).
- Continue IPv6 implementation across all university computing facilities (an IPv6 reachability sketch follows this list).
- Deploy on premises only the services and software that are a good fit for local hosting, while extending other services to other locations (e.g., Big Data analytics locally, REDCap remotely, or local HPC as a staging resource before moving to XSEDE).
- Promote and demonstrate cyberinfrastructure initiatives such as XSEDE to the campus research community.
- Develop closer partnerships with other institutions’ IT research computing teams, and coordinate and host meetings and workshops.
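As referenced in the list above, perfSONAR measurements are typically driven from its pscheduler command-line tool. The sketch below shows one hedged way to kick off a one-off throughput test from Python; it assumes the perfSONAR toolkit is installed on the local host, and perfsonar.example.edu is a hypothetical destination.

```python
import subprocess

# Hypothetical measurement peer; substitute a real perfSONAR host.
DEST = "perfsonar.example.edu"

# pscheduler ships with the perfSONAR toolkit and schedules a one-off
# throughput test (iperf3 by default) against the destination.
result = subprocess.run(
    ["pscheduler", "task", "throughput", "--dest", DEST],
    capture_output=True,
    text=True,
)

# pscheduler prints the negotiated schedule and the measured throughput.
print(result.stdout)
```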
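For the IPv6 rollout item, one quick readiness check is to resolve only AAAA records for a service and attempt a connection. The Python sketch below does exactly that; www.example.edu and port 443 are placeholder values, not a real endpoint from this project.

```python
import socket

HOST, PORT = "www.example.edu", 443  # placeholder dual-stacked service

try:
    # Restricting getaddrinfo to AF_INET6 returns only AAAA results;
    # it raises socket.gaierror when the name has no IPv6 address.
    infos = socket.getaddrinfo(HOST, PORT, socket.AF_INET6, socket.SOCK_STREAM)
except socket.gaierror:
    print(f"{HOST} has no AAAA record yet")
else:
    family, socktype, proto, _, addr = infos[0]
    with socket.socket(family, socktype, proto) as s:
        s.settimeout(5)
        s.connect(addr)  # raises OSError if the IPv6 path is broken
        print(f"IPv6 reachable: {addr[0]} port {addr[1]}")
```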
Research Computing Advisory Committee
- Dr. John Maher
- Dr. Charles Somerville
- Dr. Alfred Cecchetti
- Dr. Anthony Szwilski
- Dr. Michael W. Prewitt
- Dr. Jack Smith
- Dr. Wael Zatar