Software Engineering Institute (SEI) Podcast Series

Members of Technical Staff at the Software Engineering Institute
Jul 10, 2014 • 9min

AADL and Edgewater

In 2013, the AADL Standards meeting was held at SEI headquarters in Pittsburgh, Pa. The SEI Podcast Series team was there, and we interviewed several members of the AADL Standards Committee. This podcast is the third in a series based on these interviews.
Jun 26, 2014 • 13min

Security and Wireless Emergency Alerts

The Wireless Emergency Alerts (WEA) service depends on information technology (IT)—computer systems and networks—to convey potentially life-saving information to the public in a timely manner. However, like other cyber-enabled services, the WEA service is susceptible to risks that may enable an attacker to disseminate unauthorized alerts or to delay, modify, or destroy valid alerts. Successful attacks on the alerting process may result in property destruction, financial loss, infrastructure disruption, injury, or death. Such attacks may damage WEA credibility to the extent that users ignore future alerts or disable alerting on their mobile devices. In this podcast, Carol Woody and Christopher Alberts discuss guidelines that they developed to ensure that the WEA service remains robust and resilient against cyber attacks.
Jun 12, 2014 • 21min

Safety and Behavior Specification Using the Architecture Analysis and Design Language

In this podcast, Julien Delange discusses two extensions to the Architecture Analysis and Design Language (AADL): the behavior annex and the error-model annex. The behavior annex represents the functional logic of AADL components and their interactions with other system elements. SEI researchers are currently participating in ongoing improvements to this extension of the AADL by connecting it to other analysis tools. The error-model annex augments the architecture description by specifying safety concerns of the system (error propagation, error behavior, etc.). The language is the foundation of new analysis tools that provide qualitative and quantitative assessment of system safety and reliability. SEI researchers have developed new tools that analyze the model and produce safety validation documents, such as those required by safety standards such as SAE ARP4761.
May 29, 2014 • 15min

Applying Agile in the DoD: Sixth Principle

In this episode, the sixth in a series by Suzanne Miller and Mary Ann Lapham exploring the application of Agile principles in the Department of Defense (DoD), the two researchers discuss the application of the sixth principle: "The most efficient and effective method of conveying information to and within a development team is face-to-face conversation."
May 29, 2014 • 27min

Characterizing and Prioritizing Malicious Code

Every day, major anti-virus companies and research organizations are inundated with new malware samples. Although estimates vary, approximately 150,000 new malware strains are released each day. Not enough manpower exists to manually address the volume of new malware samples that arrive daily in analysts' queues. Malware analysts need an approach that allows them to sort samples in a fundamental way so they can assign priority to the most malicious binary files. In this podcast, Jose Morales, a malicious software researcher with the CERT Division, discusses an approach for prioritizing malware samples, helping analysts to identify the most destructive malware to examine first, based on the binary file's execution behavior and its potential impact. Related training: Malware Analysis Apprenticeship.
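The triage idea described in this episode can be sketched as a weighted scoring of observed execution behaviors, with the queue sorted so the highest-scoring samples are examined first. The behavior names and weights below are invented for illustration; they are not the actual features or model from the CERT research.

```python
# Hypothetical behavior-based prioritization sketch: score each sample by
# summing weights of the execution behaviors observed for it, then sort
# the analyst queue from most to least suspicious.

BEHAVIOR_WEIGHTS = {
    "modifies_registry_run_key": 3,
    "injects_into_process": 5,
    "contacts_known_c2_domain": 8,
    "deletes_shadow_copies": 9,
}

def score(behaviors):
    # Unknown behaviors contribute nothing to the score.
    return sum(BEHAVIOR_WEIGHTS.get(b, 0) for b in behaviors)

def prioritize(queue):
    # queue: list of (sample_id, observed_behaviors) pairs.
    return sorted(queue, key=lambda item: score(item[1]), reverse=True)

queue = [
    ("sample_a", ["modifies_registry_run_key"]),
    ("sample_b", ["injects_into_process", "contacts_known_c2_domain"]),
]
print(prioritize(queue)[0][0])  # sample_b scores 13 vs. 3, so it is examined first
```

In practice, behaviors would come from sandboxed execution traces, and the weights would reflect measured destructive impact rather than hand-picked constants.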
May 15, 2014 • 18min

Using Quality Attributes to Improve Acquisition

In the acquisition of a software-intensive system, the relationship between the software architecture and the acquisition strategy is typically not examined. Although software is increasingly important to the success of government programs, there is often little consideration given to its impact on early key program decisions. The Carnegie Mellon University Software Engineering Institute (SEI) is conducting a multi-phase research initiative aimed at answering the question: is the probability of a program's success improved through deliberately producing a program acquisition strategy and software architecture that are mutually constrained and aligned? Moreover, can we develop a method that helps government program offices produce such alignment? In this podcast, Patrick Place describes research aimed at determining how acquisition quality attributes can be expressed and used to facilitate alignment between the software architecture and the acquisition strategy.
Apr 29, 2014 • 22min

Best Practices for Trust in the Wireless Emergency Alerts Service

Trust is a key factor in the effectiveness of the Wireless Emergency Alerts (WEA) service. Alert originators at emergency management agencies must trust WEA to deliver alerts to the public in an accurate and timely manner. The public must also trust the WEA service before they will act on the alerts that they receive. Managing trust in WEA is a responsibility shared among many stakeholders who are engaged with WEA. In this podcast, Robert Ellison and Carol Woody discuss research aimed at developing recommendations for alert originators, the Federal Emergency Management Agency, commercial mobile service providers, and suppliers of message-generation software that would enhance both alert originators' trust in the WEA service and the public's trust in the alerts that they receive.
Apr 10, 2014 • 21min

Three Variations on the V Model for System and Software Testing

The importance of verification and validation (especially testing) is a major reason that the traditional waterfall development cycle underwent a minor modification to create the V model that links early development activities to their corresponding later testing activities. In this podcast, Don Firesmith introduces three variants on the V model of system or software development that make it more useful to testers, quality engineers, and other stakeholders interested in the use of testing as a verification and validation method.
Mar 27, 2014 • 18min

Adapting the PSP to Incorporate Verified Design by Contract

The Personal Software Process (PSP) promotes the use of careful procedures during all stages of development with the aim of increasing an individual's productivity and producing high-quality final products. Formal methods use the same methodological strategy as the PSP: emphasizing care in development procedures as opposed to relying on testing and debugging. They also establish the radical requirement of proving mathematically that the programs produced satisfy their specifications. Design by Contract is a technique for designing components of a software system by establishing their conditions of use and behavioral requirements in a formal language. When appropriate techniques and tools are incorporated to prove that the components satisfy the established requirements, the method is called Verified Design by Contract (VDbC). In this podcast, Bill Nichols discusses a proposal for integrating VDbC into the PSP to reduce the number of defects present at the unit-testing phase, while preserving or improving productivity. The resulting adaptation of the PSP, called PSPVDC, incorporates new phases, modifies others, and adds new scripts and checklists to the infrastructure. Specifically, the phases of formal specification, formal specification review, formal specification compile, test case construct, pseudo code, pseudo code review, and proof are added.
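As a rough illustration of the Design by Contract idea mentioned above (independent of PSP tooling and formal proof), a component's conditions of use and behavioral requirements can be written as executable pre- and postconditions. This minimal Python sketch uses plain assertions; the function and its contract are invented for illustration, not taken from the podcast.

```python
# Design by Contract sketch: the precondition states what callers must
# guarantee before calling; the postcondition states what the function
# guarantees on return. Verified DbC would prove these hold statically
# instead of checking them at run time.

def divide(numerator: float, denominator: float) -> float:
    # Precondition: callers must not pass a zero denominator.
    assert denominator != 0, "precondition violated: denominator is zero"
    result = numerator / denominator
    # Postcondition: multiplying back by the denominator recovers the numerator.
    assert abs(result * denominator - numerator) < 1e-9, "postcondition violated"
    return result

print(divide(10.0, 2.0))  # 5.0
```

In a VDbC workflow, a prover would discharge these obligations for all inputs, which is what allows the approach to catch defects before the unit-testing phase.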
Mar 25, 2014 • 37min

Comparing IT Risk Assessment and Analysis Methods

Technical professionals are often called on to research, recommend, implement, and execute IT risk assessment and analysis processes. These processes provide important data used by management to responsibly grow and protect the business through good decision making for mitigating, accepting, transferring, or avoiding risk. These decisions must account for IT risks caused by emerging threats to the enterprise and vulnerabilities in the people, processes, and technologies required for digital business. Which method you choose for IT risk assessment and risk analysis is far less important than ensuring that the selected methodology is operationalized and a good fit for the corporate culture. The selected approach must be able to produce output that is meaningful to management, and supporting processes must account for assumptions, documentation, and potential gaming of the system. Tools should be leveraged, where possible, to ease method adoption. In this podcast, Ben Tomhave and Erik Heidt, research directors with Gartner Technical Professionals, discuss methods for IT risk assessment and analysis and comparison factors for selecting the methods that are the best fit for your organization.
