A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2018; you can also visit <a rel="external noopener" href="https://ecs.victoria.ac.nz/foswiki/pub/Main/TechnicalReportSeries/ECSTR10-12.pdf">the original URL</a>. The file type is <code>application/pdf</code>.
Users are often at best an afterthought among software developers, even when they themselves are the users! Usability of programming languages and tools is all too often equated with raw functionality or casually dismissed as a matter of mere syntactic sugar. In one sense, the usability issues for programming languages and tools are nothing special. Whether it's a UML modeling tool or a Web 2.0 ERP system, on the other side of the screen is a human eye and hand coordinated by a human brain. The broad principles and specific techniques of sound interaction design apply. This keynote by an award-winning interaction designer and design methodologist attempts to frame the issues in the usability of the tools we use, exploring the dimensions of user experience in programming languages and tools, as well as examining what might be unique or special about our experience as users. Specific recommendations and proposals for improving the usability and user experience of languages and tools are presented.

Bio: Larry Constantine, IDSA, is a Professor in the Department of Mathematics and Engineering at the University of Madeira, where he teaches in the dual-degree program that he helped organize with Carnegie Mellon University in the United States. He heads the Laboratory for Usage-centered Software Engineering, a research and development group dedicated to making technology more useful and usable. One of the pioneers of modern software engineering, he is an award-winning designer and author, recipient of the 2009 Stevens Award for his contributions to design and design methods, and a Fellow of the Association for Computing Machinery.

Abstract: Software engineering practices and tools have had a significant impact on productivity, time to market, comprehension, maintenance, and evolution of software in general. Low-level systems have been largely overlooked in this arena, partly due to the complexities they present and partly due to the inherently "bare bones" nature of the domain. The fact is that anyone can understand a few lines of assembly, even hundreds, but in the tens of thousands or more, most people will require additional cognitive support. This presents an opportunity to apply state-of-the-art high-level theories to low-level practice. Our initial investigations indicate that there are real issues that even experienced developers face, such as overwhelming program size and the obfuscation of system function in malware.
We believe modern tools can help in this domain, and this paper explores the ways in which we believe visualization will be of particular importance.

Abstract: Static analysis tools have achieved great success in recent years in automating the process of detecting defects in software. However, these sophisticated tools have yet to gain widespread adoption, since many of them remain too difficult to understand and use. In previous work, we discovered that even with an effective code visualization tool, users still found it hard to determine whether warnings reported by these tools were true errors or false warnings. The fundamental problem users face is understanding enough of the underlying algorithm to determine whether a warning is caused by imprecision in the algorithm, a challenge that even experts with PhDs may take a while to master. In our current work, we propose to use triaging checklists that provide users with systematic guidance for identifying false warnings by taking into account specific sources of imprecision in the particular tool. Additionally, we plan to provide checklist assistants, a library of simple analyses designed to aid users in answering checklist questions.

Abstract: Evaluating the usability of a programming language or tool requires a number of pieces to fall into place. We raise issues along the path from study design to implementation and analysis, drawn from the experience of running several studies of a new parallel programming language, X10. We summarize several analyses that can be drawn from different aspects of the same data.

Abstract: The Ruby programming language is designed for ease of use. Usability is an important feature, since the productivity of programmers depends on it. This paper describes the design method obtained through the experience of developing Ruby. The design method can be used to make other languages and libraries easy to use.
Abstract: The dominant paradigm of concurrent programming has well-publicized usability problems, but the alternatives have not been well analyzed from a usability perspective. I attempted an empirical comparison of programmer productivity using the Actor model, transactional memory, and traditional lock-based concurrency paradigms. The results were inconclusive. I discuss my experiment, present its results, and discuss possible reasons why such experiments are a blunt tool with which to investigate programming language usability.
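To make the contrast concrete, two of the three paradigms compared in this study can be sketched minimally. The code below is an illustrative assumption of mine, not material from the paper: Python stands in for the lock-based and Actor-model styles (transactional memory is omitted, as Python has no standard STM), and the `CounterActor` class is a hypothetical name.

```python
import threading
import queue

# --- Lock-based style: shared mutable state guarded by an explicit mutex. ---
counter = 0
lock = threading.Lock()

def locked_increment(n):
    global counter
    for _ in range(n):
        with lock:           # the programmer must remember to take the lock
            counter += 1

# --- Actor-model style: state is private to one thread; callers send messages. ---
class CounterActor:
    def __init__(self):
        self.value = 0
        self.mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run)
        self._thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()   # messages are handled one at a time,
            if msg == "stop":          # so no lock is needed on self.value
                break
            self.value += 1

    def increment(self):
        self.mailbox.put("inc")

    def stop(self):
        self.mailbox.put("stop")       # FIFO: processed after all pending "inc"
        self._thread.join()

# Four threads hammer the locked counter.
threads = [threading.Thread(target=locked_increment, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The actor receives the same total number of increments as messages.
actor = CounterActor()
for _ in range(4000):
    actor.increment()
actor.stop()

print(counter, actor.value)  # 4000 4000
```

The usability difference the experiment probes is visible even at this scale: the lock version is correct only if every access site remembers the lock, while the actor version centralizes mutation in one place at the cost of an indirect message-passing style.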
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20180514213400/https://ecs.victoria.ac.nz/foswiki/pub/Main/TechnicalReportSeries/ECSTR10-12.pdf" title="fulltext PDF download">Web Archive [PDF]</a>