David Rubinstein, Author at SD Times
https://sdtimes.com/author/david-rubinstein/

OSI officially releases its definition for Open Source AI
https://sdtimes.com/ai/osi-officially-releases-its-definition-for-open-source-ai/
Mon, 28 Oct 2024

The post OSI officially releases its definition for Open Source AI appeared first on SD Times.

The Open Source Initiative (OSI) today released its open source AI definition version 1.0 to clarify what constitutes open source AI. This gives the industry a standard by which to validate whether or not an AI system can be deemed Open Source AI. 

The definition covers code, model, and data information, with the latter being a contentious point due to legal and practical concerns. Mozilla, a long-time open source advocate, is partnering with OSI to promote openness in AI, advocating for transparency in AI systems.

The ability to understand how AI systems work, so they can be researched, scrutinized and potentially regulated, is important to ensuring a system is truly open source. Ayah Bdeir, senior strategic advisor on AI strategy at Mozilla, told SD Times on the “What the Dev?” podcast that AI systems are influenced by a number of different components – algorithms, code, hardware, data sets and more. 

As an example, she cited that there are data sets to train models, data sets to test, and data sets to fine-tune, and this complexity creates a false sense of transparency that leads organizations to claim their systems are open source. “In traditional open source software, there’s a very clear separation between code that is written, a compiler that is used, and a license that is possessed. Each one of them can have an open license or a closed license and it’s very clear how each one of them applies to this concept of openness.” 

However, in AI systems, many components influence the system, Bdeir said, and “this idea that if the code is open, that means their AI systems are open” is not accurate. That does not allow the fundamental reuse or study of the system that is required under an open source mentality, embodied in the actual four freedoms – use, study, modify and share, she explained.

“The open source AI definition by OSI is an attempt to put a real fine point on what open source AI is and isn’t, and how to have a checklist that checks for whether something is or isn’t, so that this ambiguity between claiming that something is open source or actually doing it is not there anymore,” she said. 

The debate over data information was among the most controversial in coming up with the definition, Bdeir said. How do organizations that are training their models with proprietary data protect it from being used in open source AI? Bdeir explained there are two schools of thought around data in particular. In one school of thought, the data set must be made completely open and available in its exact form for the AI system to be considered open source. “Otherwise,” she said, “you cannot replicate this AI system. You cannot look at the data itself to see what it was trained on, or what it was fine tuned on, etc. And therefore it’s not really open source.”

In another school of thought, where she said some of the more hands-on builders reside, making the data available is not realistic. “Data is governed by laws that are different in different countries. Copyright laws are different in different countries, and licenses on data are not always super clear and easy to find, and if you inadvertently or mistakenly distribute data sets that you have no rights to, you are liable legally.”

OSI’s solution to this problem is to talk about data information: what OSI requires is data information, not the data in a data set. The wording, Bdeir said, says the organization must provide “sufficiently detailed information about the data used to train the system so that a skilled person can recreate a substantially equivalent system using the same or similar data.”
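The checklist Bdeir describes can be sketched as a small validator: given which components of a system are released openly, report whether the claim of "open source AI" holds up. This is a hypothetical illustration only; the component names and the pass/fail rule are a simplification of the OSI definition's code/model/data-information areas, not OSI's actual checklist.

```python
# Hypothetical sketch of an "open source AI" checklist, loosely modeled on the
# OSI definition's component areas (code, model parameters, data information).
# The names and pass/fail rule here are illustrative, not OSI's actual text.

REQUIRED_COMPONENTS = {"code", "model_parameters", "data_information"}

def is_open_source_ai(released_components):
    """Return (passes, missing_components) for a claimed open source AI system."""
    missing = REQUIRED_COMPONENTS - set(released_components)
    return (not missing, missing)

# A system that opens its code and weights but withholds data information
# fails the check -- the ambiguity Bdeir describes becomes a concrete result.
ok, missing = is_open_source_ai({"code", "model_parameters"})
```

The point of such a check is exactly what Bdeir argues: "code is open" alone no longer suffices as a claim.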

Opsera extends AI Code Assistant Insights for developer productivity
https://sdtimes.com/ai/opsera-extends-ai-code-assistant-insights-for-developer-productivity/
Wed, 23 Oct 2024

DevOps platform provider Opsera today announced AI Code Assistant Insights, empowering enterprises to improve developer productivity, impact, and time savings, and accelerate the ROI of their investment in AI Code Assistants.

“IDC research finds that on average, developers estimate a 35% increase in their productivity with the use of an AI coding assistant. However, it is challenging to have visibility into adoption and measure these gains across the organization,” said Katie Norton, Research Manager, DevSecOps at IDC. “The metrics available in Opsera’s Unified Insights should enable organizations to demonstrate the ROI of GitHub Copilot adoption, enhancing their ability to track and quantify productivity improvements.”

For enterprises looking to proactively measure the increase in ROI of their AI Code Assistant investments and improve productivity across all software delivery tools, teams, and environments, the new AI Code Assistant Insights in the Opsera Unified DevOps Platform provides actionable insights on developer-level productivity, pinpoints areas to improve adoption and includes reporting on the quality and success of AI suggestions.

Users can:
● Unify metrics across the “Code to Cloud” journey, incorporating DevEx KPIs (time to PR, lead time, cycle time, and performance), source code metrics (commits, PRs, throughput, quality, and security), and DORA metrics (deployment frequency, change failure rate, lead time, and MTTR). This comprehensive approach allows you to measure impact, acceptance rate, and velocity effectively.
● Gain actionable insights into team performance, including throughput, quality, velocity, security, and stability, as well as developer-level metrics, to pinpoint areas for improvement and optimize processes.
● Seamlessly integrate with leading AI code assistants like GitHub Copilot and Amazon Q, offering a unique, holistic “single pane of glass” view of the entire development lifecycle.
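As a rough sketch of what computing a couple of the DORA metrics named above might look like from raw deployment events: deployment frequency is deployments per unit time, and change failure rate is the fraction of deployments that caused a production failure. The record shape and field names here are assumptions for illustration, not Opsera's actual data model.

```python
from datetime import datetime

# Hypothetical deployment records; field names are illustrative,
# not Opsera's actual schema.
deployments = [
    {"at": datetime(2024, 10, 1), "caused_failure": False},
    {"at": datetime(2024, 10, 3), "caused_failure": True},
    {"at": datetime(2024, 10, 8), "caused_failure": False},
    {"at": datetime(2024, 10, 15), "caused_failure": False},
]

def deployment_frequency(deploys, window_days):
    """Average deployments per week over the observation window."""
    return len(deploys) / (window_days / 7)

def change_failure_rate(deploys):
    """Fraction of deployments that caused a failure in production."""
    failures = sum(1 for d in deploys if d["caused_failure"])
    return failures / len(deploys)

freq = deployment_frequency(deployments, window_days=14)  # 2.0 per week
cfr = change_failure_rate(deployments)                    # 0.25
```

Dashboards like the ones described simply aggregate counts like these per team, repo, or time window.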

“AI Code Assistants are critical for developer productivity and efficiency, and we are proud to enable engineering teams to adopt them and realize their benefits faster than ever before and provide metrics on the positive impact,” said Kumar Chivukula, co-founder and CEO of Opsera. “With Opsera’s Unified DevOps Platform, we provide persona and team-level insights, pinpoint bottlenecks and inefficiencies using Opsera Hummingbird AI, and measure security and quality across tools to help enterprises improve overall developer productivity and experience.”

Unlike other platforms, Opsera integrates with the entire software development lifecycle with over 100 native integrations and unified data for SDLC, IaC, SaaS applications like Salesforce, Databricks, and Snowflake, and mobile application development. This helps teams maximize their investment and provides the most comprehensive view of the software delivery lifecycle.

Accelerating innovation: How the Lucid visual collaboration suite boosts Agile team efficiency
https://sdtimes.com/softwaredev/accelerating-innovation-how-the-lucid-visual-collaboration-suite-boosts-agile-team-efficiency/
Tue, 01 Oct 2024

Fostering a positive developer experience and aligning it with business goals may seem like an obvious focus for organizational stakeholders. When developers feel empowered to innovate, they deliver customer experiences that positively impact the bottom line. Yet key organizational stakeholders still struggle to get visibility into how products are advancing, from ideation to delivery.

To help those teams gain insights into how products are advancing, Lucid Software is announcing enhancements to its visual collaboration platform that are designed to help elevate agile workflows by cultivating greater alignment, creating clarity and improving decision-making. 

“Visual collaboration is about seeing an entire workflow from the very beginning, enabling teams to align, make informed decisions and guide the initiative all the way to market delivery,” said Jessica Guistolise, an evangelist, Agile coach and consultant at Lucid. “Lucid excels at bringing all necessary information into one platform, supporting teams regardless of whether they follow Agile or simply need to iterate faster.”

Visuals, Guistolise said, are important for getting all stakeholders on the same page and improving the overall developer experience. “Prior to the pandemic, agile teams would gather in one room surrounded by visuals and sticky notes that displayed their work, vision, mission and tracked dependencies. Then, we all went home. Now where does all that information live?” Lucid, Guistolise explained, became a centralized hub for teams that have everything they need to do their work, day in and day out. 

Lucid’s latest release includes an emphasis on team-level coordination and program-level planning. On the team level, there are features for creating dedicated virtual team spaces for organizing such critical artifacts as charters, working agreements and more. Lucid’s platform replicates the benefits of physical team rooms and serves as a central hub for collaboration, where all needed documents are stored and can be shared. On the program level, real-time dependency mapping enables visualization and management of those dependencies directly from Jira and Azure DevOps (ADO). Other new features are structured big room planning templates to coordinate cross-functional work and the ability to sync project data among Lucid, Jira and ADO so the most current information is reflected across all platforms.

When it comes to team-level coordination, team spaces are customizable, allowing for a more personalized and engaging work experience. “When working with distributed teams, fostering a sense of team connection can be a challenge,” Guistolise said. “This brings some of that humanity and team experience. ‘What did you do this weekend? Can I see a picture of your dog?’ All of that can be done visually and it cultivates a shared understanding of one another, and not just of the work that we’re doing.” 

Speaking to how these features enhance the developer experience, Guistolise said she came to embrace agility because “when we bring humanity back into the workplace and elevate the overall team experience, we not only boost collaboration and efficiency but also foster connection that makes those moments more enjoyable.”

Customizable Agile templates are also available to help guide teams through daily standups, sprint planning, retrospectives and other Agile events by offering integrated tools such as timers, laser pointers and the ability to import Jira issues. 

Lucid also offers a private mode to allow for anonymous contributions of ideas and feedback. Guistolise explained that private mode offers psychological safety “to allow for those voices who may not feel comfortable speaking up or even dissenting in a meeting.” Private mode, she added, still allows teams to surface that information anonymously, which means better decisions will be made in the long run. The release also includes new estimation capabilities for streamlining sprint planning using a poker-style approach, and those estimates can be synced with Jira or ADO to align planning and execution.

Further, two-way integrations with Jira and Azure DevOps mean that “no one has to take pictures of the sticky notes on the walls and then type it into a back-end system so there’s a record of what is going on,” she said. Instead, because of the integrations, everything moves automatically back and forth between systems, providing updated, real-time information upon which to make those business and development decisions.

These latest innovations from Lucid Software empower developer teams to have a more positive working experience by providing the tools they need to navigate the complexities of Agile workflows, from daily coordination to large-scale program planning. By enhancing both team-level and program-level collaboration, Lucid continues to lead the way in providing the most intelligent and comprehensive visual collaboration platform to support modern teams.
PostgreSQL 17 adds performance gains, storage optimizations and more
https://sdtimes.com/data/postgresql-17-adds-performance-gains-storage-optimizations-and-more/
Fri, 27 Sep 2024

The PostgreSQL Global Development Group announced the release of PostgreSQL 17, the newest version of the open source database.

According to the group’s announcement, PostgreSQL 17 has improved performance and scalability while adapting to new data access and storage patterns required by cloud native computing and the rise of AI.

Among the key new features is enhanced support for JSON, which was one of the reasons users began adopting the database. In this release, the implementation of the SQL/JSON standard is mostly complete, according to Tom Kincaid, SVP of Database Server Development at EDB, a major contributor to the project. “I think one of the things people ask for the most is the implementation of JSON_TABLE, which enables you to take a JSON document and make a view of it as a relational table… it really speaks to the extensibility of Postgres, but also the continued evolution towards making it easier to adopt.”

PostgreSQL 17 now supports SQL/JSON constructors (JSON, JSON_SCALAR, JSON_SERIALIZE) and query functions (JSON_EXISTS, JSON_QUERY, JSON_VALUE), according to the group’s announcement, giving developers other ways of interfacing with their JSON data. This release adds more jsonpath expressions, with an emphasis on converting JSON data to a native PostgreSQL data type, including numeric, boolean, string, and date/time types, the group announced.
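JSON_TABLE itself requires a PostgreSQL 17 server, but the underlying idea Kincaid describes — projecting a JSON document into relational rows — can be illustrated in miniature with SQLite's `json_each` table-valued function from Python. This is a stand-in for demonstration only, not PostgreSQL syntax, and it assumes a Python build whose bundled SQLite includes the JSON functions (the default on recent versions).

```python
import json
import sqlite3

# SQLite's json_each (used here as a stand-in for PostgreSQL 17's JSON_TABLE)
# flattens a JSON array into rows that can be queried relationally.
doc = json.dumps([
    {"name": "widget", "qty": 3},
    {"name": "gadget", "qty": 5},
])

conn = sqlite3.connect(":memory:")
rows = conn.execute(
    "SELECT json_extract(value, '$.name'), json_extract(value, '$.qty') "
    "FROM json_each(?)",
    (doc,),
).fetchall()
# rows == [('widget', 3), ('gadget', 5)]
```

In PostgreSQL 17 the equivalent view over a JSON document would be expressed with JSON_TABLE column definitions, with the added benefit of casting values to native Postgres types.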

Further, according to the group’s announcement, the new version brings more features to MERGE, which is used for conditional updates; new capabilities include a RETURNING clause and the ability to update views. The release also improves bulk loading and data exporting, including up to a 2x performance improvement when exporting large rows using the COPY command. “COPY performance also has improvements when the source and destination encodings match, and includes a new option, ON_ERROR, that allows an import to continue even if there is an insert error,” the announcement said.

Another key feature Kincaid pointed out is the addition of incremental backup. “Postgres databases can be tens of terabytes, in some cases, 100 terabytes or more, and being able to do a full backup on that could take several days, just depending on your hardware and your architecture and tools,” he said. “But now with incremental backup, you can combine incremental backups into a full backup, separate from the actual database server, so you don’t have to put some extensive load on your database server to do a full backup.”

PostgreSQL 17 brings system-wide performance gains and a more robust developer experience. The release notes contain a complete list of new and changed features.

The state of open source maintainers
https://sdtimes.com/open-source/the-state-of-open-source-maintainers/
Tue, 17 Sep 2024

Paid open source maintainers do significantly more security and maintenance work than unpaid maintainers, yet 60% of all maintainers remain unpaid, according to the 2024 State of the Open Source Maintainer report from Tidelift.

“The health and security of our global software infrastructure depends on open source maintainers,” Donald Fischer, co-founder and CEO, Tidelift, said in an announcement of the report. “Paying maintainers improves their ability to ensure their projects meet the stringent security requirements that enterprise users require. These survey results show that organizations can positively impact their own security by funding the important work of the open source maintainers whose projects they rely on.”

Among the report’s key findings: 16% of the 400 respondents to the Tidelift survey identified as unpaid hobbyists who would not want to get paid, while 44% of those unpaid said they would appreciate getting paid. The report noted concern that the percentage of maintainers getting paid for their work hasn’t changed, even with organizations placing a greater focus on software supply chain security.

Maintainers who are paid get their income through donation programs, employers and Tidelift, which did the survey.

About half of the maintainers surveyed said they are underappreciated, and 43% of them said it adds stress to their lives. Not surprisingly, 60% of maintainers have either quit or considered quitting the maintenance work.

One area that has seen growth is in the percentage of maintainers aware of such things as the OpenSSF Scorecard project, the NIST Secure Software Development Framework and the SLSA framework, with the percentage of those unaware of such standards and initiatives decreasing from 52% in 2023 to 40% this year, according to the report.

In light of the XZ Utils hack, two-thirds of respondents said they are less trusting of pull requests from non-maintainers, but only 37% reported they are less trusting of co-maintainer contributions. According to the report, one maintainer wrote in response to this question: “I feel the need to add a layer of vetting, but adding any additional layer of friction to a possible open source contributor would just scare them away. I cannot afford to be pushing people away.”

When it comes to AI-based coding tools, maintainers expressed concern, with 45% saying these tools will have a somewhat negative or negative impact on their work, and 64% saying they’d be less likely to accept contributions they knew were created using AI. The report found that younger maintainers are more likely to use AI-based tools than their senior counterparts.

The full report is available from Tidelift.

Transition application code to images with Cloud Native Buildpacks
https://sdtimes.com/cloud/transition-application-code-to-images-with-cloud-native-buildpacks/
Mon, 26 Aug 2024

Much of the conversation in the software industry is around developer experience. From new ways to measure productivity to reducing important but drudge work, organizations are looking to make life more joyful for developers.

One area that’s gaining more attention is the use of buildpacks to create apps for cloud-native environments. Though not a new concept – buildpacks have been around for about 15 years – they can ease the burden on developers by simply taking source code and turning it into fully functional apps.

A quick history, according to Ram Iyengar, chief evangelist at Cloud Foundry: Heroku brought up the concept of creating immutable objects from source code, regardless of programming language or platform, in 2010. Cloud Foundry (the open source project) was working to do much the same thing, but as open source. Pivotal was an early backer and developer of the Cloud Foundry project as a commercial tool, and both projects released a v2 in 2015. But when Pivotal was acquired by VMware in 2019, the Cloud Foundry Foundation was formed to shepherd the project, and that is now under the auspices of the Cloud Native Computing Foundation.

Pivotal’s path was to make containers out of the source code provided, while Heroku’s vision did not include containers. In the cloud native vs. non-cloud native debate, there exists a divide between environments where everything runs in containers and those where not everything does. So, Heroku and Pivotal/Cloud Foundry came together to create Cloud Native Buildpacks that would be compatible with the cloud native ecosystem, which, Iyengar said, meant that “it had to be open source, it had to adhere to the OCI specification, and it had to be ready to deploy on Kubernetes and make use of cloud native constructs.” 

The non-Kubernetes version 2 of buildpacks will continue to exist for the foreseeable future, Iyengar said, while the “newer, shinier version of buildpacks is the one for containers and Kubernetes.”

Heroku went ahead with its closed source commercial implementation – which has since been open-sourced – while the Cloud Foundry Foundation in 2020 created Paketo buildpacks, which are open source and production-ready, Iyengar said.

All about the developer experience

Among the benefits of buildpacks, as we bring the narrative back around, is improving the developer experience. While there are six or seven ways JavaScript developers can get this experience of having tooling turn source code into a functional app, if you’re not using JavaScript, those tools are basically useless, Iyengar said. Paketo buildpacks enable developers to get the same build experience regardless of the source code language. 

“The kind of homogeneity that’s possible with buildpacks is phenomenal, and that’s really what I mean when I say developer experience,” Iyengar said. “It’s about allowing developers to bring any language or framework and providing them with the homogeneous and complete user interface in order to give them the best-in-class developer experience that is possible.”

Iyengar also pointed out that buildpacks can overcome automation hurdles that exist when using technologies such as Docker. “For a developer or software engineering team to maintain Docker files for local development and production, it can quickly become a big sort of development hell in creating these Docker files and maintaining them,” he said. “Buildpacks relieve users of having to write these meta files and maintain them.” He explained that with a Docker-based build process, “if you want to write a different Docker file for your GitHub actions versus if you’re running them on your pre-production machines, there are different requirements. It’s not the most optimal.” Buildpacks, he said, make the process uniform irrespective of the infrastructure you’re running on. 
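The uniformity Iyengar describes comes down to a single CLI invocation — `pack build` — that works the same for any language's source tree, in place of per-language Dockerfiles. A minimal sketch of constructing that command from Python follows; the builder name shown is one example (Paketo's Jammy base builder), and actually running the command requires the `pack` CLI and a Docker daemon, so the sketch only assembles it.

```python
# Sketch: the same `pack build` invocation works for any language's source
# tree, replacing per-language Dockerfiles. The builder shown is an example
# (Paketo's Jammy base builder); running it for real needs pack + Docker.
def pack_build_command(image_name, app_dir,
                       builder="paketobuildpacks/builder-jammy-base"):
    """Assemble the pack CLI command line for building an OCI image."""
    return ["pack", "build", image_name, "--path", app_dir, "--builder", builder]

cmd = pack_build_command("my-app:latest", "./src")
# To execute: subprocess.run(cmd, check=True)
```

Whether `./src` holds Go, Java, or Node source, the command is identical — the builder detects the language and produces an OCI image.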

The same is true for SBOMs – software bills of materials – and going forward, you’ll be able to choose between x86 images and ARM images and dictate in the build process what kind of image you want and make them all available, Iyengar said. “The focus on automation within the buildpacks community is huge.” Further, he noted, the project makes available production-ready buildpacks that are also compatible with CI/CD integrations such as CircleCI, GitLab, Tekton, and others.

Because buildpacks provide transparency into what’s in an image, and what images can and cannot contain, this is where buildpacks and AI cross. “Any AI that is able to read and parse buildpacks metadata can very conveniently look at what policies need to be set, and you can create rules like do not create or push containers to production if they contain a particular version of, say, Go that’s outdated or has a vulnerability,” Iyengar said. “And, if a new vulnerability gets detected, there can be an AI engine that basically churns through all of the buildpack layers and says, ‘these are the layers that are affected, let’s replace them immediately.’” Mitigation, he added, becomes a very trivial operation.
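The policy idea Iyengar describes — flag or block images whose buildpack layers carry a known-bad dependency version — can be sketched as a simple check over layer metadata. The layer record shape and the deny-list versions below are entirely hypothetical, for illustration; real buildpack metadata and SBOMs carry richer, standardized fields.

```python
# Hypothetical policy check over buildpack layer metadata: flag layers whose
# recorded dependency version is on a deny list. The metadata shape and the
# denied versions here are illustrative, not real vulnerability data.
DENY = {"go": {"1.20.3"}}  # example: versions flagged as vulnerable

def affected_layers(layers):
    """Return the layers containing a denied dependency version."""
    return [
        layer for layer in layers
        if layer["version"] in DENY.get(layer["name"], set())
    ]

layers = [
    {"id": "layer-1", "name": "go", "version": "1.20.3"},
    {"id": "layer-2", "name": "node", "version": "20.5.0"},
]
bad = affected_layers(layers)  # only layer-1 needs replacing
```

Because buildpacks rebuild at the layer level, remediation can then target just the flagged layers rather than the whole image — the “trivial operation” Iyengar refers to.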

Iyengar stated that the focus within the buildpacks community has been to “plug a lot of gaps that the Docker-based ecosystem has left, but it’s really about knowing what’s inside an image when you’re deploying it.” Buildpacks, he said, make it easy to attest and create the provenance that images need in our modern, security-first cloud native landscape. Going forward, he suggested, built-in SBOMs won’t just be a convenience, they’ll be a compliance requirement.
ValueOps Insights provides unified view of analytics for software value planning and delivery
https://sdtimes.com/valuestream/valueops-insights-provides-unified-view-of-analytics-for-software-value-planning-and-delivery/
Thu, 08 Aug 2024

Broadcom today announced ValueOps Insights, a solution that connects and normalizes analytics from siloed tools into a unified view to ensure organizations are able to assess if value stream delivery capabilities align with business goals.

The new solution, underpinned by the ConnectALL platform it acquired in June 2023, gathers, organizes and evaluates disparate DORA and flow metrics to provide real-time, role-based dashboards to development teams, dev managers and organization leaders to use for informed decision-making.  

“By integrating and organizing data from diverse sources across the value chain, ValueOps Insights provides the information organizations need to make better business decisions,” said Jean-Louis Vignaud, Head of ValueOps in Broadcom’s Agile Operations Division. The ability to match investment with the capability to deliver products leads to “successful value realization,” the company noted in its announcement.

This enables monitoring of investment decisions against product outcomes, and confirmation that planned product capabilities translate into tangible investment outcomes. By aligning investment intent with execution capability, the company said, it helps organizations ensure successful value realization.

DORA and Flow metrics are all about delivery efficiency, but Vignaud noted, “that doesn’t mean we’re smart in what we do. The ideal view of the world is, ‘I plan for value, I deliver value and I measure the value realization.’” While the full vision for Insights includes a value planning tool that will be integrated in the next quarter, Vignaud said, “We can start to be a bit smarter because of Flow analytics and DORA metrics.”

To broaden the value proposition, Broadcom is working on value realization. As Vignaud explained, “We capture early metrics, ensuring that indeed you are realizing the value you said you would be realizing when you do the investment.” 


Trust Agent can show if developers know their stuff
https://sdtimes.com/softwaredev/trust-agents-can-show-if-developers-know-their-stuff/
Fri, 02 Aug 2024

Developers, it appears, will not be replaced by artificial intelligence – at least not yet, anyway. What they will need to do is learn or improve their skills in providing templates for AI, become masters of fixing problems in AI-generated code, and actually learn the best uses for AI in software development.

In its current state, AI has given users pause, due to hallucinations, inaccuracies, and simply making up an answer if it doesn’t know one. As Long Island music legend Billy Joel wrote, “it’s a matter of trust.”

To help developers gain confidence in AI, and to help organizations assess if those developers have the requisite skills to ensure code is secure, the company Secure Code Warrior (SCW) will be discussing its new Trust Agent at the upcoming Black Hat conference, according to company co-founder and CTO Matias Madou. That builds on the Trust Score the company announced at the RSA Conference in April.

AI, Madou said, “doesn’t eradicate smart people. While a developer will be able to be more productive, if he or she doesn’t get more educated, they’ll only be creating bad code at rapid speeds. They will be faster, they will crank out more features, but not quality features, and not secure features.”

Many organizations have no idea whether or not their code is being written by secure developers. “Directors of AppSec, CISOs, find it’s really hard to know,” Madou said. “So what we’ve done is we can give you insights in your repositories, we can tell you if code was created by secure developers or insecure developers.”

The Trust Score is a way to determine how well-trained a developer is to write secure code, and their work can be compared to a benchmark. “We can give insight into how well are your developers in your organization creating secure code? How well-trained are they in creating secure code? And essentially, your trust score is an aggregate of all the skill scores of your developers, based on all their data as they work through the platform,” Madou explained. “So every individual developer that goes through our platform that takes training, that upskills himself or herself, gets a skill score, and the aggregate of the skill scores is a Trust Score.”
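As a rough illustration of that aggregation (a hypothetical sketch; SCW has not published its scoring formula, and the names and numbers here are invented), an organization-level Trust Score built from per-developer skill scores might look like:

```python
# Hypothetical sketch only: models the Trust Score as a simple average of
# per-developer skill scores (0-100). SCW's real formula is not public.
from statistics import mean

def trust_score(skill_scores):
    """Aggregate individual developer skill scores into one org-level Trust Score."""
    if not skill_scores:
        raise ValueError("no developer skill scores available")
    return round(mean(skill_scores), 1)

# Invented example data: one skill score per developer on the platform
org_scores = {"ana": 82, "ben": 64, "chen": 91}
print(trust_score(list(org_scores.values())))  # → 79
```

In practice the aggregate would presumably weight factors such as accuracy and topic coverage, but the shape is the same: many individual skill scores rolled up into one benchmarkable number.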

“We sit on a mountain of data, of 250,000 active learners today, around 600 enterprise companies and 20 million data points,” Madou explained. “So we asked the group of data scientists, ‘hey, if you look at the data here, can you figure out what a skilled developer looks like solely by looking at the data of how people go through our platform?’ “

SCW’s Trust Agent, which integrates with GitLab, GitHub and Bitbucket – “all the Gits,” he said – doesn’t look at code, or check for errors. It will pick up metadata about a developer when he or she checks in code. Does that developer have a Trust Score? What level of secure coding is he or she at? Do they know what they’re doing? Based on that, they can say if a developer is secure or not.
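The kind of metadata check Madou describes could be sketched like this (purely hypothetical; the actual Trust Agent’s logic, thresholds, and API are not public):

```python
# Hypothetical sketch of a check-in gate that looks only at committer
# metadata, never at the code itself. The threshold is an assumed policy.
MIN_TRUST = 70  # assumed policy threshold, not an SCW-documented value

def assess_commit(committer, trust_scores, minimum=MIN_TRUST):
    """Classify a check-in based on the committer's Trust Score, if one exists."""
    score = trust_scores.get(committer)
    if score is None:
        return "unknown: no Trust Score on record"
    return "secure" if score >= minimum else "needs secure-coding training"

scores = {"ana": 82, "ben": 55}
print(assess_commit("ana", scores))  # → secure
print(assess_commit("ben", scores))  # → needs secure-coding training
print(assess_commit("dee", scores))  # → unknown: no Trust Score on record
```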

SCW found that some developers are very meticulous, with high accuracy, showing they know what they’re doing. Others click through the platform simply for compliance, and aren’t learning anything, and that’s visible in those patterns. “So out of the data, they were able to distill a pattern of what a secure developer looks like. And out of that, they get a score. If they do this, and do that, if they have high accuracy, and they touch on the OWASP Top 10, we can give them a high Trust Score, because they want to learn, and they understand that first they learn, then they prove.”

The Trust Agent, Madou said, can now see, “Oh, you’re doing something. Let me tell you about that developer. Let me tell you if that developer knows his or her stuff, or if they don’t.”


You may also like…

Code in the fast lane: Why secure developers can ship at warp speed

Generative AI development requires a different approach to testing


Developers, leaders disconnect on productivity, satisfaction https://sdtimes.com/softwaredev/developers-leaders-disconnect-on-productivity-satisfaction/ Tue, 16 Jul 2024 17:07:27 +0000

The post Developers, leaders disconnect on productivity, satisfaction appeared first on SD Times.

The advent of DevOps, cloud-native computing, API use and now AI have made creating software way more complex for developers. These factors have also impacted the developers’ experience and productivity – and how productivity is measured.

No longer do software engineers simply write code and run some tests. Now they have to manage API integrations for required services, handle security through the use of software bills of materials, maintain these complex applications, and learn to use AI while understanding the risks associated with all of the above.

According to a study of 2,100 practitioners released Monday by Atlassian, the top five areas of developer role complexity are:

  • Understaffing – this forces developers to take on responsibilities of other roles (48% of respondents)
  • Expansion of the developer role – bringing in testing, security, operations and maintenance (47%)
  • New technology – developers need training on such things as AI and other new tech (47%)
  • Switching context between many tools – tool sprawl is a big issue for organizations (43%)
  • Collaboration with other teams – this can be avoided through more effective use of tools (43%)

Development team leaders say they understand the importance of the developer experience (DevEx). In the study, 86% of leaders believe that attracting and retaining the best talent is nearly impossible without a great developer experience.

Unfortunately, less than half of the developers surveyed believe their organizations prioritize developer experience.

Most organizations today realize that developer experience and productivity are closely related. Andrew Boyagi, head of DevOps evangelism at Atlassian, believes there are three key factors in creating a positive experience: the ability to maintain a flow state, reduced cognitive load, and a constant feedback loop. “When developers have access to the information they need in a centralized format and can review progress in regular data-informed retrospectives, they are able to get more work done and have a more enjoyable experience doing it,” Boyagi said.

Among the tactics he said Atlassian has seen success with to achieve that ‘developer hat trick’ are providing powerful DevOps tools, empowering teams to take more control over their roadmaps, and creating an engineering culture “that encourages experimentation and knowledge sharing. But the first and most important step is to speak with your developers. You can’t begin to improve friction points if you don’t fully understand where those friction points are,” he explained.

One technique organizations are using to reduce friction points is internal developer portals (IDPs) combined with platform engineering. The goal of platform engineering is to standardize tooling, but it comes with both benefits and pitfalls. The obvious benefits, according to Boyagi, are reduced software tool costs and reduced developer complexity created by tool sprawl. Among the downsides are sacrificing best-of-breed tooling that developers have come to rely on, or removing functionality that’s required by specific teams within an organization.

“Creating a positive DevEx is a balancing act,” Boyagi said. “In large organizations, a good approach is to standardize on certain areas of tooling, and allow flexibility in others. For example, it’s logical to standardize on a source code repository, so all code is in one place. You may, however, allow teams to choose from a variety of testing tools. Regardless of strategy, for a positive DevEx it’s important that tools are integrated in a way that minimizes context switching, developers outside the platform team have a voice in the selection of tools, and there is a feedback mechanism for the ongoing performance of tooling.”

Developers as generalists

Ethan Sumner, founder and CEO at research and analysis startup DevEx Connect, said the adoption of DevOps practices has turned software developers into generalists, doing a little bit of a lot of different roles. 

“Very early on in my career, I worked for an extremely small company, there were four of us,” he said. “We were all developers, there was no operations. It was just developers, and the operations side was absolutely atrocious. When we did a deployment, it took two days for us to do it, not two minutes, like all these large enterprises have got operations down to a tee. 

“And all of our developer environments were built using Oracle VirtualBox, which took three hours to spin up,” Sumner continued. “And it was a productivity nightmare. But afterwards, I went down to MasterCard, where we did operationally things extremely well. Having these kinds of build environments, development environments, a lot of developers just want to develop and code all day; they don’t want to worry about which kind of staging environment, how does it look going into production, a lot of them don’t want to be on call. I think a lot of organizations are trying to put code developers as true generalists, when really, there should still be a bit of segregation between these kinds of roles. You know, people develop, people operate.” 

Measuring productivity

Before software became so complex, developer productivity was basically measured in the number of lines of code written per day, or hours worked. Today, that fails to take into account the wait times associated with the silos organizations have created to separate out work, as well as other inefficiencies, such as waiting on pull requests or even taking time to learn more about testing and security.

According to the survey, 41% of organizations use tools that measure developer productivity to assess development team satisfaction. This, the survey said, raises a red flag about whether or not an organization is tracking the proper metrics with the correct tools. 

“Our survey found that more than half of the engineering leaders using [these kinds of] metrics … find them ineffective as a measure of developer productivity,” Boyagi said. “While you can measure productivity, there is no one metric, or set of metrics that rules them all. This is because developer experience and productivity are highly contextual between teams and organizations. Organizations need to look at things from a 360-degree view and focus on three things: developer sentiment (how they feel about their work and environment), workflows (how efficient and reliable systems and processes are), and KPIs (the measures your team obsesses over, based on your specific situation).”
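Boyagi’s three-lens view could be sketched as a simple structure that keeps the lenses side by side rather than collapsing them into a single number (an illustrative sketch only; Atlassian prescribes no such data model, and the field names and example figures are invented):

```python
# Illustrative sketch of a "360-degree" DevEx snapshot: sentiment, workflow
# health, and team-chosen KPIs reported together, never averaged into one score.
from dataclasses import dataclass

@dataclass
class DevExSnapshot:
    sentiment: float       # e.g. a survey score on a 1-5 scale
    workflow_health: dict  # e.g. build times, PR wait, in whatever units the team uses
    kpis: dict             # the measures this particular team obsesses over

    def report(self):
        lines = [f"sentiment: {self.sentiment}/5"]
        lines += [f"workflow | {k}: {v}" for k, v in self.workflow_health.items()]
        lines += [f"kpi | {k}: {v}" for k, v in self.kpis.items()]
        return "\n".join(lines)

snap = DevExSnapshot(3.8, {"median PR wait (hours)": 9}, {"change failure rate": "4%"})
print(snap.report())
```

Keeping the three categories separate reflects the point that developer experience is contextual: two teams can share the structure while tracking entirely different workflow measures and KPIs.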

Will AI be a game-changer?

A study by IDC predicts that $40 billion will be spent on AI tools this year. And Atlassian’s study found that development leaders believe that using AI is the most effective way of improving both productivity and satisfaction.

Yet only 30% of responding developers said AI-based development tools will improve personal productivity, and 32% responded “only slightly.” This continues to show the disconnect between how leaders view productivity and satisfaction, and how developers see it.

“AI can help improve developer experience, but it can’t solve all the pain points of development teams to improve productivity and satisfaction,” Boyagi noted. “There is the potential for significant gains in things like incident response, info searching, and documentation but only if applied as a solution to an actual issue developers in an organization are facing. It’s critical for leaders to ask developers about their friction points and then focus on implementing the right solutions and cultural changes to make a difference.”


You may also like…

IDPs may be how we solve the development complexity problem

Q&A: Why over half of developers are experiencing burnout


A guide to supply chain security tools https://sdtimes.com/security/a-guide-to-supply-chain-security-tools/ Mon, 08 Jul 2024 17:59:18 +0000

The post A guide to supply chain security tools appeared first on SD Times.

The following is a listing of vendors that offer tools to help secure software supply chains, along with a brief description of their offerings.


Featured Provider

HCLSoftware: HCL AppScan empowers developers, DevOps, and security teams with a suite of technologies to pinpoint application vulnerabilities for quick remediation in every phase of the software development lifecycle. HCL AppScan SCA (Software Composition Analysis) detects open-source packages, versions, licenses, and vulnerabilities, and provides an inventory of all of this data for comprehensive reporting.

See also: Companies still need to work on security fundamentals to win in the supply chain security fight

Other Providers

Anchore offers an enterprise version of its Syft open-source software bill of materials (SBOM) project, used to generate and track SBOMs across the development lifecycle. It also can continuously identify known and new vulnerabilities and security issues.

Aqua Security can help organizations protect all the links in their software supply chains to maintain code integrity and minimize attack surfaces. With Aqua, customers can secure the systems and processes used to build and deliver applications to production, while monitoring the security posture of DevOps tools to ensure that security controls put in place have not been circumvented.

ArmorCode‘s Application Security Posture Management (ASPM) Platform helps organizations unify visibility into their CI/CD posture and components from all of their SBOMs, prioritize supply chain vulnerabilities based on their impact in the environment, and find out if vulnerability advisories really affect the system.

Contrast Security: Contrast SCA focuses on real threats from open-source security risks and vulnerabilities in third-party components during runtime. Operating at runtime effectively reduces the occurrence of false positives often found with static SCA tools and prioritizes the remediation of vulnerabilities that present actual risks. The software can flag software supply chain risks by identifying potential instances of dependency confusion.

FOSSA provides an accurate and precise report of all code dependencies up to an unlimited depth; and can generate an SBOM for any prior version of software, not just the current one. The platform utilizes multiple techniques — beyond just analyzing manifest files — to produce an audit-grade component inventory.

GitLab helps secure the end-to-end software supply chain (including source, build, dependencies, and released artifacts), create an inventory of software used (software bill of materials), and apply necessary controls. GitLab can help track changes, implement necessary controls to protect what goes into production, and ensure adherence to license compliance and regulatory frameworks.

Mend.io: Mend’s SCA automatically generates an accurate and deeply comprehensive SBOM of all open source dependencies to help ensure software is secure and compliant. Mend SCA generates a call graph to determine if code reaches vulnerable functions, so developers can prioritize remediation based on actual risk.

Revenera provides ongoing risk assessment for license compliance issues and security threats. The solution can continuously assess risk across a portfolio of software applications and the supply chain. SBOM Insights supports the aggregation, ingestion, and reconciliation of SBOM data from various internal and external data sources, providing the needed insights to manage legal and security risk, deliver compliance artifacts, and secure the software supply chain.

Snyk can help developers understand and manage supply chain security, from enabling secure design to tracking dependencies to fixing vulnerabilities. Snyk provides the visibility, context, and control needed to work alongside developers on reducing application risk.

Sonatype can generate both CycloneDX and SPDX SBOM formats, import them from third-party software, and analyze them to pinpoint components, vulnerabilities, malware, and policy violations. Companies can prove their software’s security status easily with SBOM Manager, and share SBOMs and customized reports with customers, regulators, and certification bodies via the vendor portal.

Synopsys creates SBOMs automatically with Synopsys SCA. With the platform, users can import third-party SBOMs and evaluate for component risk, and generate SPDX and CycloneDX SBOMs containing open source, proprietary, and commercial dependencies.

Veracode Software Composition Analysis can continuously monitor software and its ecosystem to automate finding and remediating open-source vulnerabilities and license compliance risk. Veracode Container Security can prevent exploits to containers before runtime and provide actionable results that help developers remediate effectively.

Open Source Solutions

CycloneDX: The OWASP Foundation’s CycloneDX is a full-stack Bill of Materials (BOM) standard that provides advanced supply chain capabilities for cyber risk reduction. Strategic direction of the specification is managed by the CycloneDX Core Working Group. CycloneDX is also backed by the Ecma International Technical Committee 54 (Software & System Transparency).
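For readers unfamiliar with the format, a minimal CycloneDX JSON BOM describing a single component looks roughly like this (a hand-written fragment for illustration, not generated output; the component is arbitrary):

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "left-pad",
      "version": "1.3.0",
      "purl": "pkg:npm/left-pad@1.3.0"
    }
  ]
}
```

Real BOMs add metadata such as the generating tool, timestamps, licenses, and dependency relationships on top of this skeleton.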

SPDX is a Linux Foundation open standard for sharing SBOMs and other important AI, data, and security references. It supports a range of risk management use cases and is a freely available international open standard (ISO/IEC 5962:2021).

Syft is a powerful and easy-to-use CLI tool and library for generating SBOMs for container images and filesystems. It supports CycloneDX, SPDX, and its own JSON output format. Syft can be installed and run directly on the developer machine to generate SBOMs against software being developed locally, or can be pointed at a filesystem.
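A typical invocation looks like the following (illustrative only; exact flags and supported formats vary by Syft version):

```shell
# Generate a CycloneDX JSON SBOM for a container image
syft alpine:3.19 -o cyclonedx-json > sbom.cdx.json

# Point Syft at a local directory instead of an image
syft dir:. -o spdx-json > sbom.spdx.json
```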

