OpenCitations on JRTDD web site

Dear readers,

We have created a link to OpenCitations on the JRTDD web site.

OpenCitations is a scholarly infrastructure organization dedicated to open scholarship and the publication of open bibliographic and citation data by the use of Semantic Web (Linked Data) technologies, and engaged in advocacy for semantic publishing and open citations. It provides the OpenCitations Data Model and the SPAR (Semantic Publishing and Referencing) Ontologies for encoding scholarly bibliographic and citation data in RDF, and open software of generic applicability for searching, browsing and providing APIs over RDF triplestores. It has developed the OpenCitations Corpus (OCC) of open downloadable bibliographic and citation data recorded in RDF, and a system and resolution service for Open Citation Identifiers (OCIs), and it is currently developing a number of Open Citation Indexes using the data openly available in third-party bibliographic databases.

OpenCitations is currently working to expand and improve the supporting infrastructure of the OpenCitations Corpus (OCC), our open repository of scholarly citation data made available under a Creative Commons public domain dedication, which provides in RDF accurate citation information (bibliographic references) harvested from the scholarly literature. These are described using the SPAR Ontologies according to the OpenCitations Data Model, and are made freely available so that others may freely build upon, enhance and reuse them for any purpose, without restriction under copyright or database law.

JRTDD Editor-in-chief

JRTDD articles in XML format

Dear readers,

I want to announce that you can now find JRTDD articles in XML format, which will increase the visibility and likely the indexing of our journal.

What is XML?

XML (Extensible Markup Language) is a file format used to create common information formats and to share both the format and the data on the World Wide Web, intranets, and elsewhere using standard ASCII text.

XML is similar to HTML. Both XML and HTML contain markup symbols to describe the contents of a page or file. HTML, however, describes the content of a Web page (mainly text and graphic images) only in terms of how it is to be displayed and interacted with. For example, the letter “p” placed within markup tags starts a new paragraph.

XML describes the content in terms of what data is being described. For example, the word “phonenum” placed within markup tags could indicate that the data that followed was a phone number. An XML file can be processed purely as data by a program or it can be stored with similar data on another computer or it can be displayed, like an HTML file. For example, depending on how the application in the receiving computer wanted to handle the phone number, it could be stored, displayed, or dialed.
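The "phonenum" example above can be sketched in a few lines of code. This is a minimal illustration using Python's standard-library XML parser; the element names ("contact", "name", "phonenum") and the sample values are hypothetical, chosen only to mirror the example in the text.

```python
# A program processing XML purely as data: the tags say *what* each
# value is, and the receiving program decides what to do with it
# (store it, display it, or dial it).
import xml.etree.ElementTree as ET

document = """
<contact>
  <name>J. Smith</name>
  <phonenum>555-0100</phonenum>
</contact>
"""

root = ET.fromstring(document)

# Unlike HTML, nothing here describes display; <phonenum> simply
# labels the data that follows as a phone number.
name = root.findtext("name")
phone = root.findtext("phonenum")
print(f"{name}: {phone}")
```

Running this prints the values extracted from the markup; the same file could equally be rendered in a browser or passed on to another program.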

You can find articles from Vol. 1, Issue 1 in XML here. I would like to express my sincere gratitude to our web administrator Gjorgji Pop Gjorgjiev for giving us this opportunity.

JRTDD Editor-in-chief

Open Access Week in the Netherlands

The International Open Access Week took place from 22 until 26 October. A range of events were held across the Netherlands, based on the theme ‘Designing Equitable Foundations for Open Knowledge’. The Netherlands Open Science Festival, for instance, was organised jointly by the university libraries, SURF, the National Open Science Platform and the PhD Network of the Netherlands (PNN). This festival for researchers centred on the question of how scientists can make their own research open. Participants shared experiences, new insights and practical tools.

In addition, Marjan Grootveld (DANS) gave two interactive webinars entitled ‘Q&A FAIR data and trusted repositories’ and ‘Openness, exchange, FAIR Data – oh brave new world that has such vision in’t!’. The presentations were a plea for FAIR research data, stored in reliable repositories. To conclude the Open Access Week, SURF, Fontys and TU Delft organised the seminar ‘Open Science meets Open Education’.

In addition to the various events held during the week, the VSNU also published a daily vlog featuring the major stakeholders involved in open access, such as Minister Ingrid van Engelshoven, chief negotiator Koen Becking and PNN chair Anne de Vries. Those vlog posts are available here.

Open access developments in the Netherlands

Open access: intermediate results in the Netherlands
In late 2013, State Secretary Dekker formulated objectives with regard to open access, which were then tightened in the National Open Science Plan at the start of 2017: ‘100% open access publishing by 2020’. How much progress have we made so far?

Experts from all universities have established a definition framework that can be used to determine the percentage of articles published open access and to distinguish between ‘gold’, ‘hybrid’ and ‘green’. Figures from 2017 reveal that 50% of the peer-reviewed articles from 14 Dutch universities are available open access (out of a total of 55,713 articles). This was true of 42% of articles in 2016. At most universities, the highest percentage of open access articles was found in the category ‘Hybrid and not DOAJ OA’ (20% in 2016 and 23% in 2017).

Open Access Week 22-28 October 2018

Theme of 2018 International Open Access Week To Be “Designing Equitable Foundations for Open Knowledge”

The 2018 Open Access Week Advisory Committee is pleased to announce that the theme for the 2018 International Open Access Week, to be held October 22-28, will be “designing equitable foundations for open knowledge.”

This year’s theme reflects a scholarly system in transition. While governments, funders, universities, publishers, and scholars are increasingly adopting open policies and practices, how these are actually implemented is still in flux. As open becomes the default, all stakeholders must be intentional about designing these new, open systems to ensure that they are inclusive, equitable, and truly serve the needs of a diverse global community. This year’s Open Access Week invites all interested stakeholders to participate in advancing this important work.

Setting the default to open is an essential step toward making our system for producing and distributing knowledge more inclusive, but it also comes with new challenges to be addressed. How do we ensure sustainability models used for open access are not exclusionary? What are inequities that open systems can recreate or reinforce? Whose voices are prioritized? Who is excluded? How does what counts as scholarship perpetuate bias? What are areas where openness might not be appropriate?

These are not questions with easy answers. Rather, they are prompts for ongoing conversations that can help ensure that the foundation for a more equitable system of open research and scholarship is created thoughtfully and collaboratively. This year’s theme highlights the importance of asking the tough questions, staying critical, and actively engaging in an ongoing conversation to learn from diverse perspectives about how to make scholarship more equitable and inclusive as it becomes more open.

Established by SPARC and partners in the student community in 2008, International Open Access Week is an opportunity to take action in making openness the default for research—to raise the visibility of scholarship, accelerate research, and turn breakthroughs into better lives. This year’s Open Access Week will be held from October 22nd through the 28th; however, those celebrating the week are encouraged to schedule local events whenever is most suitable during the year and to utilize themes that are most effective locally.

The global, distributed nature of Open Access Week will play a particularly important role in this year’s theme. Strategies and structures for opening knowledge must be co-designed in and with the communities they serve—especially those that are often marginalized or excluded from these discussions altogether.

International Open Access Week is an important opportunity to catalyze new conversations, create connections across and between communities that can facilitate this co-design, and advance progress to build more equitable foundations for opening knowledge—discussion and action that must continue throughout the year, year in and year out. Diversity, equity, and inclusion must be prioritized year-round and integrated into the fabric of the open community, from how our infrastructure is built to how we organize community events.

For more information about International Open Access Week, please visit www.openaccessweek.org. You can follow the conversation on Twitter at #OAWeek.

Translations of this announcement are available in Chinese, Hindi, Portuguese, and Spanish. If you are interested in contributing a translation of this year’s theme or the full announcement in another language, you can find instructions for doing so here.

Graphics for this year’s Open Access Week theme are available at http://www.openaccessweek.org/page/graphics.

 

About SPARC
SPARC®, the Scholarly Publishing and Academic Resources Coalition, is a global coalition committed to making Open the default for research and education. SPARC empowers people to solve big problems and make new discoveries through the adoption of policies and practices that advance Open Access, Open Data, and Open Education. Learn more at sparcopen.org.

About International Open Access Week
International Open Access Week is a global, community-driven week of action to open up access to research. The event is celebrated by individuals, institutions and organizations across the world, and its organization is led by a global advisory committee. The official hashtag of Open Access Week is #OAweek.

New features: Better reviewer communication, public article analytics and more!

It’s been an exciting first half of the year for Scholastica. We now have over 700 journal users and we’re continuing to roll out new features to keep improving our software in order to best serve journal editors, authors, and reviewers. Recently, we introduced some updates to both peer review and open access publishing, including:

  • Improvements to how editors and reviewers communicate with each other
  • Easier file downloading for editors
  • Faster journal website load times and public analytics for HTML articles

Read on for the full details!

Journals can set automatic review reminder email frequency

We know that efficient communication is key throughout peer review. The easier it is for editors to check in on reviewers’ progress without inundating them with emails and the easier it is for reviewers to quickly communicate their recommendations to editors the better. To that end, we’ve introduced two new features to improve editor and reviewer communication.

First, we’ve given journals greater control over automated reviewer reminder emails. Now, editors can decide how frequently they want reviewers to receive automatic reminders at each stage of the peer review process — before the reviewer has responded to an invitation, after response but before the review deadline, and once the deadline has passed and the review is late.

The admin editors of journals can now set email frequencies for the following review reminder categories by going to My Journals > Settings > Configuration Options:

  • Reminders to accept outstanding invitations
  • Reminders to submit accepted reviews
  • Reminders to submit late reviews

These options will enable editors to more closely control the cadence of their reviewer outreach before and after assignments are due. For example, if your journal does not want to send reviewers reminder emails to complete their reviews unless they are late, you can elect not to send any reminders to submit accepted reviews and instead send reminders only for late assignments.

Reviewers can set file permissions for feedback form attachments

In addition to giving editors more control over reviewer reminders, we’ve also made it easier for reviewers to quickly designate whether files they are attaching to their review feedback form are intended just for the journal’s editors or for the editors and the manuscript’s authors. Reviewers now have the option to upload any accompanying files to either an editors only section or an editors and authors section. With this new feature, the intended audience of each reviewer attachment should be clear, helping to avoid back and forth between editors and reviewers as well as the potential of editors forgetting to share attachments intended for the author.

Editors can download all manuscript files at once

It’s also easier for editors to access the manuscript files that they need. We know that downloading manuscripts with multiple attachments can be cumbersome, so we’ve made it possible to download a manuscript and all of its accompanying files with one click, in addition to the ability to download individual files. Now, when editors go to a manuscript’s work area they will see a “download all files” link. Click the link and get everything you need!

Source: https://blog.scholasticahq.com

Declaration of Rights and Principles to Transform Scholarly Communication


  1. No copyright transfers.
  2. No restrictions on preprints.
  3. No waivers of OA Policy.
  4. No delays to sharing.
  5. No limitations on author reuse.
  6. No impediments to rights reversion.
  7. No curtailment of copyright exceptions.
  8. No barriers to data availability.
  9. No constraints on content mining.
  10. No closed metadata.
  11. No free labor.
  12. No long-term subscriptions.
  13. No permanent paywalls.
  14. No double payments.
  15. No hidden profits.
  16. No deals without OA offsets.
  17. No new paywalls for our work.
  18. No non-disclosure agreements.

Source: https://senate.universityofcalifornia.edu/_files/committees/ucolasc/scholcommprinciples-20180425.pdf

JRTDD in Eprints

Respected colleagues,

I have the great honor to inform you that the Journal for ReAttach Therapy and Developmental Diversities is included in the EPrints digital repository.

What is Eprints?

EPrints has been leading innovation in the Open Access movement over the past 15 years. EPrints provides a set of mature ingest, preservation, dissemination and reporting services for your institution’s OA needs.
Created in 2000 as a direct outcome of the 1999 Santa Fe meeting that decided on the OAI-PMH protocol, EPrints software provides stable, pragmatic infrastructure on which institutions the world over have been building to enable their Open Access agendas.

As Open Source Software, EPrints’ greatest asset is the community of developers, librarians and users that feed into its progress and keep EPrints the innovative platform that we are so proud of.

Source: http://www.eprints.org/

On this occasion I would like to thank my colleagues Prof. Dr. Boro Jakimovski and Prof. Dr. Dejan Gjorgjevikj from the Faculty of Computer Science and Engineering (FINKI) at the Ss. Cyril and Methodius University, who gave us technical support for this platform. We hope that through this platform the journal will be much more visible and better indexed.

JRTDD Editor-in-chief

SEEING THE BIGGER RESEARCH PICTURE: THE BENEFITS OF ALTMETRICS DATA FOR AUTHORS

By: Amy Rees, Customer Support Manager at Altmetric

What and why: does article performance data matter?

With an uptick in the number of journals being published, journals can now use article performance data to provide added value for researchers, encouraging them to publish with their organisation and rewarding that choice. This additional value can manifest in different ways, from raw numbers such as downloads to more nuanced data such as discussions of an article in policy documents.

Collating different types of article performance data allows authors to see rich data associated with their publications. Authors can be rewarded for their community outreach and engagement, seeing the fruits of their labour pay off.

The article performance data also provides a wider picture of how the research is being received and presented online. This allows authors to consider the questions: How are mainstream sources presenting their research? How are laypersons reacting to the research? How are governmental and non-governmental organisations using the research in the “real world”?

One facet of article performance data is the collection and presentation of altmetrics. Altmetrics, or alternative metrics, describe non-traditional attention to scholarly outputs. Altmetrics as an idea covers a wide range of types of attention: news stories, shares and mentions on social media, references from government policy documents or patents, and much more.

Designed to be complementary to traditional bibliometrics (citations between articles, for example), altmetrics can provide a much more immediate, richer picture of who is engaging with a piece of research, and how it was received.

Recognising author engagement efforts

Providing altmetrics to authors, an approach which has become increasingly commonplace amongst academic publishers, can help not only encourage them to disseminate the results from their research, but also to see the positive effects of doing so. Outreach and engagement by an author is a key factor in increasing the type of attention paid to a publication or the quantity of engagement. As authors take the time to blog, engage with readers on Twitter, do interviews with news programs, and address comments on public peer review forums, they are creating a conversation about their work, and that takes considerable effort.

Beyond viewing the impact in the context of journal performance, this type of dissemination also allows for audiences outside of academia to develop their understanding of key issues that impact society. The effort authors put into making their research available and easy to understand is not without benefit to the academic community, whose funders often rely on public donations or whose institutions may seek to raise their profile in a specific field.

Undertaking this kind of broader engagement, and tracking its outcomes, is also increasingly used by individual researchers looking to demonstrate the influence of their work to potential funders, hiring committees, or as part of national research performance reviews.

Staying on top of the story

News coverage, now more than ever and whether true or false, dominates the public understanding of research. Popular science is discussed in major newspapers and dissected in opinion pieces. One journal article can be discussed in multiple news outlets within a short period of time, even with conflicting stories or perspectives. New research published in Science, “The spread of true and false news online”, indicates that false news stories are shared at a much higher rate than those based in truth. This means that “getting ahead of the story” is critical for authors and the communications teams that support them, to ensure their research is being properly positioned.

If an author doesn’t have access to this news data they might not see a misinterpretation of their research and miss the opportunity to respond or clarify. Further, they might also miss the opportunity to engage with an interested community.

The aggregation of news stories, a type of altmetrics, lets the author keep track of how their research is being positioned and then they can work with the media/marketing team of a publisher to correct any issues or highlight particular feedback.

Likewise, public peer review platforms such as Publons, the source used by Altmetric, allow researchers to see the peer reviews of their publication in an open format. This openness gives authors the chance to respond to relevant criticism within their own field. Concerns about results, data collection, and other aspects of research can be addressed via public discussions. This encourages inter-group conversations about research and allows more direct feedback on the publication.

Expand data available to authors

Collating altmetrics data can be challenging and time consuming for authors. While a simple search online may surface some of the news stories about a publication, it masks the effort necessary to build a complete picture: a user is constrained by the search engine they are using and by what it considers “important results”.

Taking the time to truly understand the attention and engagement associated with a publication can lead authors to arduous searching of multiple platforms and sources. By providing altmetrics, journals are pulling together a snapshot of the online attention available and bringing it into a single place, saving authors time and hassle.

Altmetrics are also a great way to highlight sources that might not be available to authors otherwise. While some sources, such as Twitter and news, could be available to users, other sources are harder to find or simply unavailable to authors.

Finding mentions of a publication in policy documents is often a particular challenge for authors. Not all policy sources or organisations make their publications available in an easy-to-read format such as PDF or Word. They can be buried in website archives in older formats or simply hard to locate.

Further, there is a practical issue with extracting policy references: where does an author even start? Most governments publish thousands of different policy documents per year, making such references hard to track down by hand.

As with policy documents, finding references to journal articles in syllabi is nearly impossible for an individual academic. While authors may be aware of where their articles are used in their own institution, or perhaps within their own sector, many other institutions may be using their research for teaching. This data could be invisible to researchers, who may not even realise the use and breadth of their research.

Highlight readership and academic engagement

While engagement with laypersons offers valuable insight into dissemination, authors are likely also interested in which other authors and academics are reading and considering their publications. Readership data, such as that provided by Mendeley, shows who has saved a paper in their academic library to read or use in a future publication. It also provides geographic and discipline data for the readers who have saved the paper, so authors can see where researchers are saving their publications and in which disciplines. Additionally, Mendeley readership has been correlated, in some fields, with long-term citations.

Services such as F1000 – Faculty of 1000 – allow users to see which academics have recommended their paper. This data shows that academics have not just saved the paper but have read it and deemed it of value. Though saving a paper does denote at least interest and possible future engagement, these recommendations show direct engagement with the paper and a public endorsement of its content.

Complementary data

Article performance data should be viewed as interlocking and complementary data, with altmetrics working together with more traditional sources such as downloads, views, and citations. While traditional citations take longer to accrue they represent an important part of the story for understanding the performance of an article.

As with any other article performance data, a high volume of citations does not necessarily mean agreement or quality. For example, the since-retracted paper by Andrew Wakefield et al. linking the MMR vaccine and autism has more than a thousand citations.

Similarly, download counts and views provide another type of article performance data, giving authors a sense of the immediate response to the paper. While a high number of downloads and views does not necessarily denote agreement, it does display engagement and attention to the publication.

Concluding

Providing altmetrics to authors is more than saying “You have X number of news stories and X number of Facebook posts”, though that can be valuable attention in itself. Altmetrics data allows journals to provide a more complete picture of the attention that has been paid to an author’s publication. From laypersons to science communicators to other academics and everyone in between, article performance data is a key source of valuable information for journals to provide to authors.

Article performance data is not only about addressing potential issues and positioning the research in the media; it is also about allowing authors to see the whole story. Providing a variety of different data allows authors to see the areas, both disciplinary and geographic, that have shown interest in their publication, and to build connections with others who might be interested. In adding article performance data to the author package, a journal is not only giving an author data, it is showing the value of a publication beyond its appearance in that journal.

Source: https://against-the-grain.com

How to define authorship

The prevailing standard for defining authorship in scientific publishing comes from the International Committee of Medical Journal Editors (ICMJE). These standards are broadly applicable in journals across disciplines and are a great place to start when creating or iterating on your authorship policy. According to the ICMJE, an author is someone who meets all the following criteria:

  1. Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work
  2. Drafting the work or revising it critically for important intellectual content
  3. Final approval of the version to be published
  4. Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved

In short, each author should have made an important contribution that enabled the study to be completed, be aware of how the results were presented, and be willing to stand up for the final manuscript. Beyond your policy for inclusion, it is also best practice to indicate authorship practices that you consider unethical, such as

  • Guest/honorary authorship: inclusion of someone who did not contribute in order to capitalize on their name recognition or out of a sense of obligation
  • Ghost authorship: omission of a rightful author from the final list

To guide the corresponding author to carefully consider whether someone qualifies for authorship, consider asking him or her to indicate the contributions that each author has made to the paper. The recently defined CRediT taxonomy has been used by several journals as a way to clearly demonstrate each author’s role on a given paper.

Presenting the CRediT taxonomy criteria (or a version of them that is appropriate for your journal) front and center keeps your authors on the same page as you. Authorship is incredibly important to career advancement for researchers, so it is important for journals to take it seriously and apply fair and consistent standards to all published works.

Author order

In a handful of fields, authors are listed alphabetically. (These are the easy ones!) However, in many others, the order in which authors are listed has implications for the authors. The first author is generally considered to be the primary contributor, and the last author may be seen as providing general oversight and direction (as the head of the lab, for example). Authors in the middle have contributed sufficiently to be listed on the paper, but perhaps in more limited ways than the primary authors.

To prevent what can be a long, protracted dispute later, it is best to ensure that author order is correct when you first receive a manuscript. The ICMJE recommends getting confirmation from every author listed on the paper that they contributed to the work and agree with the order in which they appear on the author list. Even if this is not possible or practical, be sure to require the corresponding author to confirm that they have verified the final author order with all other authors.

Contributorship

Researchers, or anyone else who has contributed to a paper in a meaningful way but falls short of the requirements for authorship, should still be recognized for their work where possible. Often this takes the form of an “Acknowledgments” section. Although contributorship does not have the career implications that authorship does, it is still a public recognition of work that contributors will appreciate and can benefit from.

Some examples of contributorship include the following:

  • General oversight of a research group
  • Administrative or technical support
  • Writing and editing assistance
  • Assistance in conducting research or analyzing data, but without substantially affecting study design or interpretation (e.g., transcribing survey results)

Source: https://against-the-grain.com