Wednesday, July 30, 2014

Impact Factors 2013 for Medical Informatics Journals released (July 2014)

(Toronto, 29 July 2014) Today the 2013 journal impact factors (JIF) were released by Thomson Reuters.
For the fifth year in a row, the Journal of Medical Internet Research (JMIR), the flagship journal of JMIR Publications, is ranked as the leading (#1) journal in its discipline, with a 2013 impact factor of 4.7, out of 25 leading medical informatics journals. These results hold even if the impact factor is corrected for journal self-cites. The 2013 impact factor measures how often articles published in the previous two years (2011-2012) were cited in 2013, and is (for better or worse) an important metric for academics deciding where to submit their best work.
To put this in perspective, an impact factor of 4.7 is roughly that of Annals of Medicine, a well-respected general medicine journal. JMIR would rank 17th among all general medical journals (NEJM, JAMA, The BMJ, etc.) if it were listed in the general medical journal category.
This also once again puts JMIR clearly ahead of the runner-up, JAMIA, published by the BMJ Group for the American Medical Informatics Association (AMIA), which has an impact factor of 3.9. Another important Open Access (general medical journal) competitor, Plos One, is continuing its decline and now has an impact factor of 3.5. Elsevier's International Journal of Medical Informatics is ranked 4th with 2.7. JMIR's impact factor also once again puts it ahead of all 48 specialist journals of the BMC series (published by Springer's BioMed Central), including BMC Med Inform Decis Mak (1.5), and ahead of all 47 Hindawi journals regardless of discipline (Hindawi and BMC are well-known Open Access publishers). Schattauer's Meth Inform Med (1.1) and Applied Clinical Informatics (0.3) are at the bottom of the field in the medical informatics category.

JMIR is also ranked #4 in the large health care sciences & services category (which lists 85 journals), ahead of respected journals such as Health Affairs.
Equally remarkable is the fact that JMIR is now the largest journal among all medical informatics journals, with 285 articles published in the period for which the 2013 JIF was calculated. Only Stat Med (which is a statistics journal) published more articles than JMIR. Since 2012, JMIR has been the only peer-reviewed journal publishing daily (every weekday), and it expects to publish over 500 articles in 2014 alone, including articles in a dozen new sister journals.

Once again, this is a major achievement for a small publisher, founded independently by a leading academic in the field, Prof Gunther Eysenbach (an elected Fellow of the ACMI), who in the late '90s saw an emerging field and helped to found and shape it by giving leading scholars a platform to disseminate their work in journals and conferences. "Our success has not only defined and given credibility to a new field of research, which we called 'medical Internet research' (and which others now call digital health, participatory medicine or health 2.0), but has also put pressure on traditional publishers in the field to make more research open access," Eysenbach says in a statement released Tuesday. "We also welcome that other publishers are now entering the field with similar journal titles related to Internet interventions and digital health, which is a sign that the field has matured and that the success of JMIR has inspired others."
Although JMIR has come out on top of the JIF ranking for the past 6 years, Eysenbach remains very critical of giving too much weight to the impact factor: "Despite the importance of the impact factor, we are actually not making our decisions based on impact factor considerations. More important are fit with the field and innovation - we are not a traditional medical informatics journal, but have a very applied and also patient/consumer-oriented focus, in that we are less interested in papers that report innovations for clinicians - these are referred to our new sister journals." He recommends that clinical informatics papers be submitted to JMIR's new sister journal JMIR Med Inform, which focuses on traditional medical informatics topics as well as emerging topics like big data in medicine. Eysenbach continues to educate his authors that an obsession with the impact factor should not guide their decision to insist on publication in the original JMIR: "Unfortunately, it takes many years for new journals to get indexed, and I agree with the DORA statement, which says that over-reliance on the impact factor is harming innovation and progress. New emerging areas of research such as serious games or mhealth/uhealth, for which we have created sister journals, are not yet covered by the SCI/JIF, which is years behind. We have published seminal works in our sister journals such as JMIR mHealth, where individual articles have attracted significant citations, but which will have to wait a few years for an impact factor. We urge our authors not to decide where to submit solely on the basis of the impact factor, and to consider our sister and partner journals as well. We are confident that once indexed by SCI they will come out on top of the field as well," says Eysenbach.

JMIR Publications, the leading publisher for digital health, continues to grow, now has offices in Toronto and Hong Kong, and publishes a dozen journals at the intersection of health and technology/innovation, including JMIR Research Protocols, JMIR Serious Games, JMIR Medical Informatics, interactive Journal of Medical Research, JMIR mHealth and uHealth, JMIR Mental Health, JMIR Human Factors, JMIR Rehabilitation and Cyborg Technologies, Medicine 2.0, and others. Other titles such as JMIR Public Health, JMIR Cancer, JMIR Bioengineering and JMIR Nanomedicine are in preparation. JMIR Publications also produces the leading academic conference series in Internet research, social media, and mhealth (Medicine 2.0). JMIR Publications was a cofounder of OASPA (Open Access Scholarly Publishers Association) and is committed to the highest quality and ethical standards, as well as quick turnaround times. JMIR was the first open access journal in the field, the first journal to implement open peer-review, and the first journal to publish article-level impact metrics based on social media ("Twimpact Factor") alongside its articles. JMIR Publications continues to innovate and is involved in a new startup, TrendMD, a novel academic discovery and dissemination platform for publishers and academic authors.

Wednesday, December 11, 2013

On the JASIST Haustein paper on tweets and citations

A new paper in JASIST by Stefanie Haustein, Isabella Peters, Cassidy R Sugimoto, Mike Thelwall, and Vincent Lariviere (Tweeting Biomedicine: An analysis of Tweets and Citations in the Biomedical Literature) looks at Tweetations (citations in tweets to scholarly articles).
It proposes using Twitter as a tool to measure "social impact". Sounds familiar? That is because I wrote about this 2 years ago in my seminal paper "Can tweets predict citations?" (with one of the authors of the JASIST paper having been the peer-reviewer). The Journal of Medical Internet Research (JMIR) - probably the most tweeted journal - was in fact the pioneer in collecting what is now known as altmetrics (Disclosure: I am the founding editor and publisher of JMIR).

This is what I wrote at the time:
"The true use of these metrics [twitter metrics] is to measure the distinct concept of social impact. Social impact measures based on tweets are proposed to complement traditional citation metrics. The proposed twimpact factor may be a useful and timely metric to measure uptake of research findings and to filter research findings resonating with the public in real time.  
"It should be noted that prediction of citations is not necessarily the end goal, and that lack of correlation is not necessarily a failure, because it is clear that these metrics add a new dimension of measuring impact."

While in the case of JMIR I did see an association - there was a dichotomous relationship between highly tweeted articles and highly cited articles - I never expected this to hold true across all disciplines and journals. The JASIST paper does not contradict my earlier paper. In fact, the JASIST authors found statistically significant correlations between tweets and citations, just like I did:
"Correlations between Twitter coverage and Twitter citation rates with traditional bibliometric indicators for journals were positive and significant, with rates between .223 and .312. Comparing formal citations and Twitter citations for all papers published in 2011, we found a low but positive correlation of .183, which suggests that, although both indicators are somewhat related, they mostly measure a different type of impact." (Haustein 2013)
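For readers who want to reproduce this kind of number: Spearman's rho is simply the Pearson correlation of the rank-transformed counts. Below is a minimal, dependency-free sketch; the per-article tweet and citation counts are invented for illustration, not real JMIR or JASIST data.

```python
# Spearman rank correlation between per-article tweet counts and
# citation counts, implemented from scratch (no SciPy needed).

def ranks(values):
    """Rank values from 1..n, averaging ranks for ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank for a tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented counts for ten hypothetical articles:
tweets    = [120, 4, 33, 0, 87, 15, 2, 240, 9, 51]
citations = [34, 2, 10, 1, 25, 8, 0, 40, 5, 12]
print(round(spearman(tweets, citations), 3))
```

Note that this toy dataset is almost perfectly monotone, so rho comes out near 1; real tweet/citation data are far noisier, which is exactly why the observed coefficients sit around 0.2.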

I agree with this, although the authors missed an opportunity to discuss my JMIR paper in this context, where I found a Spearman correlation of 0.22 and wrote:

It should be stressed again that one should neither expect nor hope for perfect correlation. Tweetations should be primarily seen as a metric for social impact and knowledge translation (how quickly new knowledge is taken up by the public) as well as a metric to measure public interest in a specific topic (what the public is paying attention to), while citations are primarily a metric for scholarly impact. Both are somewhat correlated, as shown here, but tweetations and citations measure different concepts, and measure uptake by or interest of different audiences (Figure 12). The correlation and mutual interaction between these audiences is illustrated in Figure 12 with bidirectional arrows. (Eysenbach 2011)

Moreover, I feel that Haustein and colleagues were sloppy in their data acquisition (unexplained omissions of important journals such as JMIR, Plos, and BMC journals) and missed an opportunity to dig deeper into the data (which were, by the way, provided to the authors rather than collected by them).
  • First, they excluded JMIR articles, where an association has previously been found, and also other important journals (possibly all electronic-only journals, which have article IDs rather than page numbers), which renders some of their data (table 1, table 2, fig 1) invalid.
  • Secondly, they did not look beyond Spearman correlation coefficients and missed an opportunity to analyze the data the way I did, which is to get rid of the noise by dichotomizing the data (highly tweeted/highly cited vs less tweeted/cited). Social media signals are messy, and signals can be missed if you don't look at them carefully.
Because of these serious omissions, I do not agree with the overall tenor of the article that implies that tweets are only about humorous or funny articles and are useless to measure aspects of scholarly impact.

First, and perhaps most disturbing, for some unexplained reason they excluded the Journal of Medical Internet Research (JMIR) (without saying so in their methods; in an email to me they explained it with "problems in their matching process") as well as other major journals such as Plos [UPDATE: it looks like they inadvertently excluded ALL electronic-only journals, as these have an article identifier rather than page numbers, and they failed to map these against altmetrics data]. This is more than a minor oversight, because JMIR is probably the most tweeted journal, and the first and only journal where an association between highly tweeted articles and highly cited articles was previously found (Eysenbach 2011), and Haustein and colleagues are certainly aware of this earlier work. Plos One is the largest medical journal in the world, and was omitted as well. BMC journals (another large open access publisher) are nowhere to be found. Haustein and colleagues claim that Nature is the highest tweeted journal (13,000 tweets for articles published between 2010-2012), while our internal data show that JMIR has attracted approximately the same number of tweets (and far more on a per-article basis). Plos articles (and in fact open access articles in general) are much more widely tweeted than articles from subscription journals, so their inadvertent omission of electronic journals introduces a huge bias. Haustein et al present the top 15 highly tweeted articles of the entire PubMed-indexed literature, but had they not omitted JMIR, at least 4 JMIR papers would have been among them (see JMIR Top Articles and the figure below; compare these to their table 2 below).
Incidentally, my paper "Can tweets predict citations?" has attracted over 1300 tweets to date (and is also highly cited), while the top tweeted article in their entire dataset has 963 tweets (see their table 2). This cannot be explained by different data collection methods, because their data source shows tweet counts for JMIR papers similar to the ones we collected (shown below).

I find the exclusion of the journal where a strong association has previously been found a disturbing oversight which may have altered their overall conclusions. Not only would it have changed their table of the 15 most tweeted articles, but also their figure 1. In the corrected figure 1 (below) I added where JMIR would be, with 99.7% of its articles being tweeted and a mean of 33 tweets per article. It should also change their overall conclusions, as it is obvious that in certain areas there is much more value in tweets as an early indicator of scientific impact than they acknowledge.

If JMIR is missing from this dataset, then what else is missing? Where is Plos One in all this, the medical journal which publishes the most articles? Plos journals are nowhere to be found in their paper, and this omission cannot be explained by their exclusion criteria. The apparent unexplained omission of data (including the most tweeted journal - JMIR - and the journal with the most articles - Plos One) is disturbing and in my view justifies a retraction or correction. The authors acknowledged "technical problems" in an email to me, and ordinarily this would require them to correct or retract the paper if discovered after publication.

Secondly, there are problems with the analysis, and I think the JASIST authors missed a major opportunity to do a thorough "by journal" analysis, rather than just looking at all articles as a whole. Ideally, they would have (with attribution) replicated the algorithm and analysis I did with JMIR for all other journals out there, using the metrics I used. Rather than "total tweets", which are a function of time and which need to be seen relative to similar articles from the same journal, I defined the twimpact factor as the number of tweets within the first 7 days after publication (TWIF7), and the twindex as the rank percentile compared to the 20 previous articles published in the same journal (TWINDEX). I proposed these metrics precisely because raw tweet counts are useless if they are not adjusted at least for time and journal. Most importantly, the JASIST authors did not stratify by journal, but looked at "disciplines", which is a stunning oversight given the major confounding effect of the journal. Nobody ever argued that there is a linear correlation between tweets and citations across ALL articles in the scientific literature. As depicted in Fig 12 of my paper, the journal is a major confounder, because variables like journal visibility, journal marketing, and journal accessibility influence how often an article is tweeted, and the authors did not adjust for these. For example, I would expect open access journal articles to attract more tweets than non-open access articles. Furthermore, they only looked at linear rank correlations (which were not very strong even in the JMIR dataset: Spearman correlation 0.22 for JMIR), while I dichotomized the data and asked whether highly tweeted articles are predictive of highly cited articles (a major difference in the analytical approach), which is how we should look at how the social media signal can be used.
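For concreteness, here is a minimal sketch of how these two adjusted metrics could be computed from the definitions above. The data structures and function names are my own illustration, not the actual JMIR implementation.

```python
from datetime import datetime, timedelta

def twif7(article):
    """Twimpact factor: number of tweets within the first 7 days
    after the article's publication date."""
    cutoff = article["published"] + timedelta(days=7)
    return sum(1 for t in article["tweet_times"] if t <= cutoff)

def twindex(article, journal_articles):
    """Twindex: rank percentile of this article's TWIF7 relative to the
    20 articles published immediately before it in the same journal
    (0 = fewer first-week tweets than all of them, 100 = more than all)."""
    previous = sorted(
        (a for a in journal_articles if a["published"] < article["published"]),
        key=lambda a: a["published"], reverse=True)[:20]
    if not previous:
        return None   # no baseline window yet
    score = twif7(article)
    return 100.0 * sum(twif7(a) < score for a in previous) / len(previous)
```

So an article with a twindex of 75 attracted more first-week tweets than three quarters of the 20 articles that preceded it in the same journal, which is what makes the metric comparable across journals with very different baseline tweet volumes.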
What needs to be done is to look at the most highly tweeted articles of a specific journal, and then check whether they are more likely to be highly cited articles in that journal compared to other articles in that same journal. This is what I did (and I would expect this to be true for some other journals other than J Med Internet Res, especially in the medical informatics discipline).

The Spearman correlation coefficient in my dataset for JMIR was 0.22 (between tweets and Web of Science citations). They actually found higher, statistically significant correlations of 0.265 for the Lancet and 0.280 for Nature Biotechnology (Haustein, personal communication), indicating that had they done a proper within-journal analysis, they might have seen results similar to mine for some journals. A proper analysis would mean including all articles, not just the tweeted ones (they took a shortcut by excluding non-tweeted articles from some of their analyses), using Google Scholar citations instead of WoS citations (which I found to be better correlated), using the twimpact factor, i.e. tweets within the first 7 days of publication (which adjusts for time), and the twindex (which adjusts for journal and seasonal effects), and dichotomizing results into highly tweeted/highly cited articles.

While the Journal of Medical Internet Research was the first journal to collect and display tweets (since 2008), most journals today do so. This certainly shows that both readers and publishers find value in these metrics, although all journals except JMIR still use relatively meaningless raw tweet counts rather than adjusted metrics such as the Twimpact Factor.
In my 2011 paper (which the authors of the JASIST paper fail to discuss) I suggested that tweetations are a "timely metric to measure uptake of research findings and to filter research findings resonating with the public in real time". I specifically proposed these as social impact metrics, rather than as a replacement for citation metrics: tweetation counts are in some cases weakly correlated with citations, but fundamentally measure something different. It does not make sense to assume (and I have never heard anybody suggest) that there is a universal correlation between tweets and citations, although for some journals there may be, if their articles generally attract a sufficient number of tweets. By not looking beyond the Spearman correlation coefficient of unadjusted tweets to unadjusted citations, the authors of the JASIST paper missed a major opportunity to answer the question of what the data look like for other journals if the same carefully devised methodology and metrics are used as I used for JMIR. The omission of JMIR articles and probably also other journals (PLOS One?) from their dataset is a further major concern.

Update 1 (11/Dec/2013): Haustein and colleagues have now confirmed in an email to me that, because of "technical issues", they excluded not only JMIR (the journal with the most tweets per article, around 13,000 tweets to articles published between 2010-2012, which would constitute about 4% of the tweets they looked at), but also Plos One (the world's largest medical journal, which I believe publishes 30,000 articles this year) and all BMC journals (the largest publisher of open access articles). I suspect that their "technical problems" stem from omitting articles that have article identifiers instead of page numbers, as is the case for all electronic-only journals, because they did a sloppy mapping of altmetrics vs Web of Science data. If this were the case, it would exclude all electronic and/or open access articles, which would introduce a serious bias, as I suspect it is mainly OA articles that are tweeted. Asked whether these errors wouldn't justify a retraction or at least a correction of the paper, they told me they wouldn't because "it would not change the general picture or the conclusions". It's always interesting to come across scientists who know the conclusions before looking at the data... What I do know is that their tables and figures are not worth much without including the most tweeted journals.

Update 2 (11/Dec/2013): I was trying to think of an example to illustrate the differences in the analytical approach. Haustein et al. used Spearman correlation coefficients (ranking articles by tweets and ranking articles by citations, expecting every rank to match on a global article level), while in my JMIR paper I dichotomized the data (highly tweeted vs less tweeted, trying to predict the most highly cited articles in a journal), adjusted by journal. I did this because I did not expect a linear correlation between tweets and citations, but I do think the fact that an article is highly tweeted has predictive power for how well it does in terms of citations, relative to other articles in the same journal.
Why do these different analytical approaches yield different conclusions? An example may illustrate how absurd their approach is. Consider the correlation between the number of chocolate bars eaten per year by any specific person in the world and body weight (e.g. make a table with two columns for each person - number of chocolate bars eaten, and weight - then rank everyone by each criterion and see if any given person lands on the same rank by both criteria). The Spearman correlation coefficient of this would be very low on a global level, because people don't get fat only by eating chocolate, and different countries have different "normal weights" and eating habits (in some countries chocolate may not be the drug of choice), not to mention the different age groups included in the table. This is basically what they did - look at a "global" correlation. But now stratify the data at a country or even city level, adjust by age, and group all people of the same age who are in the top 25% of chocolate eaters (the top quartile). Are these people more likely to be in the top 25% of weight within their age group and city? I am sure they will be - the "odds" of being in the higher weight group will be much higher for the top chocolate eaters. In any tweet/citation analysis, the data must likewise be adjusted (stratified) by journal. In my tweets vs citations analysis (for one specific journal), I found the odds ratio to be 11, i.e. highly tweeted articles were 11 times more likely to be in the highly cited group. This is a pretty strong association, even though the initial correlation of the unadjusted data was low (Spearman correlation coefficient in my dataset: 0.22). The Spearman correlation coefficients in their data were higher for some journals, yet they conclude tweets are useless for identifying scholarly important papers.
With tweets being sparse events (the majority of articles do not get many tweets, which introduces considerable "noise" at the bottom end of less tweeted articles), the correlation coefficient is expected to be low. But this does not mean that highly tweeted articles have no correlation with high scholarly impact.
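To make the dichotomized approach concrete, here is a minimal sketch of the kind of 2x2 analysis described above: split one journal's articles into highly tweeted vs not (top quartile of tweets) and highly cited vs not (top quartile of citations), then compute the odds ratio. The counts and the quartile cutoff rule are invented for illustration; they are not the actual JMIR analysis.

```python
def top_quartile_cutoff(values):
    """Value at the 75th percentile; articles at or above it
    form the top quartile."""
    s = sorted(values)
    return s[int(len(s) * 0.75)]

def odds_ratio(tweets, citations):
    """Odds ratio from the 2x2 table of highly tweeted x highly cited."""
    tw_cut = top_quartile_cutoff(tweets)
    ci_cut = top_quartile_cutoff(citations)
    a = b = c = d = 0
    for t, ci in zip(tweets, citations):
        hi_t, hi_c = t >= tw_cut, ci >= ci_cut
        if hi_t and hi_c:
            a += 1   # highly tweeted and highly cited
        elif hi_t:
            b += 1   # highly tweeted, not highly cited
        elif hi_c:
            c += 1   # highly cited, not highly tweeted
        else:
            d += 1   # neither
    return (a * d) / (b * c) if b and c else float("inf")

# Invented per-article counts for one journal:
tweets    = [0, 1, 2, 3, 10, 12, 14, 16]
citations = [5, 0, 1, 20, 2, 10, 12, 30]
print(odds_ratio(tweets, citations))
```

An odds ratio well above 1 (11 in the JMIR dataset) means that being in the top tweet quartile substantially raises the odds of ending up in the top citation quartile, even when the raw rank correlation across all articles is modest.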

Saturday, October 5, 2013

Unscientific spoof paper accepted by 157 "black sheep" open access journals - but the Bohannon study has severe flaws itself

This week, the journal Science has published a news article about the apparent lack of proper peer-review at many open access journals.
The author, contributing Science reporter John Bohannon, concocted a spoof paper with deliberate scientific problems, which ended up being accepted by 157 of 304 open access journals.
As the editor and publisher of a leading open access journal (Journal of Medical Internet Research, JMIR - not to be confused with JIMR, the Journal of International Medical Research published by SAGE, which accepted the spoof paper), which is ranked #1 by impact factor in its Thomson Reuters category (ahead of journals published by Elsevier, BMJ, etc.), I too was a target of this spoof, having received several emails from the fictitious author trying to bribe me to accept his article. For the record: I didn't even send it out for peer-review, as it was out of scope. Oddly enough, these emails and my journal don't even show up in Bohannon's data appendix and in his denominator. Also, Bohannon writes that he excluded journals which charge a submission fee, and yet his data appendix of 314 journals lists several that do charge submission fees. If they were excluded, why do they show up in his data appendix? Was our journal excluded, even though he did send me the paper by email and tried to convince me to accept it for money?
Bohannon's data are inconsistent, and this article would not have survived peer-review, not only because it is scientifically unsound (it draws sweeping conclusions without a control group, from a non-random sample of questionable publishers) and ethically questionable (he never got back to me to apologize for the spoof or even to disclose that the paper he sent me was a spoof), but also because of obvious inconsistencies in the data. Ironically, this paper is about the lack or failure of peer-review, yet this "study" itself is not peer-reviewed (and it would have been unpublishable in a serious, peer-reviewed journal). Despite circumventing peer-review, it is widely disseminated through press releases (bypassing the usual "scientific" process).
While I appreciate that this study generated a list of open access publishers which have low-quality or no peer-review (see below), it should also be said that the overarching implied conclusion - that open access as a business model is flawed, or that OA journals are generally of lower quality than subscription journals - is outrageous.
What bothers me most is that we can't even refute or replicate this study (by also sending the spoof paper to subscription journals), as no ethics board in the world would approve such a blatantly unethical "study" (using deception and wasting the time and resources of hundreds of journals and academics). So it remains what it is - a piece of bad, sensationalist journalism, unfortunately published in a journal called "Science", implying a scientific study.

Among other problems described above, Bohannon fails to point out that ...

1) Legitimate Open Access publishers are the main victims of the scammers and fraudsters that call themselves OA publishers but are nothing more than criminal organizations.

2) The "journals" tested here are largely criminal organizations from Beall's list (a list of questionable OA publishers; long before Beall, I started my own list in 2008 by blogging about questionable/unethical publishers like Bentham or Dove, which sure enough also ended up accepting the spoof paper). Anybody who wants a list of legitimate OA publishers can easily consult the list of OASPA members (Open Access Scholarly Publishers Association), an organization I co-founded out of exactly these concerns about quality. While I think OASPA can and should do more to ensure the quality of its members, I do think being an OASPA member is a pretty good predictor of being a legitimate publisher. I hope, however, that OASPA takes swift action against those of its members which accepted the spoof paper, including Dove and Sage (I am not involved in the membership committee, and unfortunately, Dove recently became a member of OASPA despite my previous concerns. I would advocate a 2-year membership suspension, after which the OASPA membership committee should submit a test paper to see if the peer-review process has improved).

3) Well-known problems with peer-review (reviewers and editors missing critical errors in a manuscript) are not specific to OA journals. Spoof papers, fraudulent or poor papers, and pharma-sponsored papers have all been accepted by subscription-based journals before.

4) The article implies that all "author-pays" OA journals have a conflict of interest, making more money if they accept more articles, ignoring the fact that a simple solution would be institutional memberships with OA journals or submission fees (payable regardless of the outcome). At JMIR Publications, we have experimented with these business models and hope that the scientific community will be increasingly open to such innovative models (we wish that more universities and departments would become institutional members supporting the work of JMIR - institutional membership leads to an article fee waiver on a per-paper basis). There are many possible business models for gold OA beyond "pay on acceptance".

It is foolish to extrapolate these findings from a few black-sheep publishers and scammers (mainly based in Nigeria and India, whose journals are generally neither in PubMed nor have an impact factor) to an entire industry. This would be as logical as concluding from Nigerian wire fraud emails that all lawyers who charge a fee for service are scammers!
There are only two well-known publishers on this list - Sage and Elsevier (the largest publisher of subscription-based journals) - and it is not without irony that these publishers primarily publish toll-access journals (see Appendix).

Gunther Eysenbach MD MPH FACMI
Editor/Publisher, JMIR Publications

Screenshot showing where most of the bank accounts of the seemingly "International" or "American" journals that accepted the spoof paper lead - the vast majority being in India and Nigeria.

Update 7-October-2013: 

I have made our email exchange with the fake author (aka Bohannon) available online. JMIR has a submission fee which needs to be paid at the end of the submission process (otherwise the submission is flagged as "incomplete" and the editor doesn't even see it in the submission queue), but we do have a waiver policy for authors from developing countries. I am not arguing here about whether JMIR or its sister journals like i-JMR (which has a broader scope) should have been included in the sample, but the fact that neither JMIR nor the email exchange shows up in the data appendix to the Bohannon study raises several issues and questions:

  1. If JMIR is not listed in the spreadsheet, and the emails are also not included in the data appendix, what else is missing from the data provided by Bohannon? Were any other journal editors contacted?
  2. Had I accepted the paper sent to me by email, waiving the submission fee on the grounds that the author was from a developing country, would JMIR suddenly have been included in the list of journals which accepted the paper?
  3. It seems a bit like the inclusion/exclusion criteria were not set a priori but were made up as he went along (a rather unscientific approach). It is clear from our website and submission process that we charge a submission fee, so why did he attempt to submit the paper anyway? And why does he later write that these journals were excluded?
  4. There is an ethical issue here if he submits a flawed paper by email and communicates with me under a false identity, and never debriefs me to say that this was part of an investigation. In fact, any Ethics Board would probably require that a study involving fraud/deception at least debrief study participants (some may even require retrospective informed consent).

APPENDIX - Publishers and throw-away/crap journals which have accepted the spoof paper

(Source: Data Appendix from Science, 4 October 2013, vol. 342, no. 6154, pp. 60-65, DOI: 10.1126/science.342.6154.60)

publisher / journal_name
Scientific Journals Journal of Pharmaceuticals
International Scientific Publications Ecology & Safety
KEJA Publications International Journal of Pharmaceutical and Biological Research
Engineering and Technology Publishing Journal of Medical and Bioengineering
International House for Academic Scientific Research International Journal of Advanced Medical Sciences and Applied Research
Indian Society for Education and Environment Indian Journal of Drugs and Diseases
International Journals of Scientific Research Advances in Biological Sciences
Electronic Center for International Scientific Information International Journal of Agriculture: Research and Review  
PharmaInterScience Publishers International Journal of Pharmacy and Biomedical Sciences
Pharmaceutical Research Foundation Journal of Advanced Pharmaceutical Research
Macrothink Institute Journal of Biology and Life Science
World Academy of Science, Engineering and Technology International Journal of Medical, Pharmaceutical, Biological, and Life Sciences
Signpost e Journals Signpost Open Access Journal of Biomedical and Pharmaceutical Sciences
Internet Scientific Publications, LLC The Internet Journal of Herbal and Plant Medicine
Bentham open The Open Bioactive Compounds Journal
Scientific Research Publishing Open Journal of Radiology
JK Welfare & Pharmascope Foundation International Journal of Research in Pharmaceutical Sciences
Innovative Scientific Information & Services Network Bioscience Research
Scienpress Ltd International Journal of Health Research and Innovation
Academic Research Publishing Agency Journal of Pharmacy and Clinical Sciences
Center for Enhancing Knowledge European International Journal of Science and Technology
Asian Economic and Social Society Journal of Asian Scientific Research
Segment Journals Biological Segment
International Network for Applied Sciences and Technology Journal of Plant Biology Research
Science and Engineering Publishing Company Advances in Chemical Science
British Association of Academic Research British Journal of Medical and Health Sciences
Global Journals, Inc. (US) Global Journal of Medical Research
Medwell Online Research Journal of Pharmacology
Electronic Center for International Scientific Information International Journal of Agriculture: Research and Review
GKS Publishers Research in Biotechnology
eJournals of Academic Research & Reviews eJournal of Biological Sciences
Narain Publishers Pvt. Ltd World Journal of Surgical Medical and Radiation Oncology
Biomedical Informatics Publishing Group Bioinformation
ScienceHuB American Journal of Medical and Dental Sciences
International Journal of Advanced Technology and Engineering Research International Journal of Advanced Technology and Engineering Research
Maxwell Scientific Organization Asian Journal of Medical Sciences
BioInfo International Journal of Drug Discovery
African Journals Online African Journal of Biomedical Research
World Scientific and Engineering Academy and Society WSEAS Transactions on Biology and Biomedicine
Scientific & Academic Publishing International Journal of Cancer and Tumor
Euresian Publications Universal Journal of Environmental Research and Technology
Innovare Academic Sciences International Journal of Applied Pharmaceutics
Radiance Research Academy International Journal of Current Research and Review
Research and Review
OMICS Publishing Group Medicinal Chemistry
Kobe University School of Medicine, Kobe Kobe Journal of Medical Sciences
Modern Scientific Press International Journal of Modern Biology and Medicine
Dove Medical Press Ltd Drug Design, Development and Therapy
Science Journal Publication Science Journal of Medicine and Clinical Trials
Scientific Research Platform European Journal of Scientific Research
Basic Research Journals Journal of Medicine and Clinical Sciences
Victorquest Publications International Journal of Traditional and Herbal Medicine
Independent Publishing house Professional Medical Journal
Science & Knowledge Publishing Corporation Limited European Journal of Biological and Life Sciences
International Research Journal International Research and Review
Sage  Journal of International Medical Research
Deccan Pharma Journals Deccan Journal of Medicinal Chemistry
World Science Publisher Journal of Science
InTech Open Access Publisher International Journal of Integrative Medicine
Sphinx Knowledge House International Journal of PharmTech Research
World Academic Publishing International Journal of Life Science and Medical Research
Society of Education Advances in Bioresearch
Global Advanced Research Journals Global Advanced Research Journal of Medicine and Medical Sciences
Association of Pharm innovators Asian Journal of Biomedical and Pharmaceutical Sciences
ARPN Journals Journal of Agricultural and Biological Science
International Journal of Current Research International Journal of Current Research
Centre For Info Bio Technology International Journal of Basic and Applied Medical Sciences
e-journals International Research Journal of Medical Sciences
Scholarly Journals International Scholarly Journal of Medicine
Science Park Journals Journal of Medicine and Radiology
International Journals of Multidisciplinary Research Academy International Journal of Engineering, Science and Mathematics
PBS Journals Asian Journal of Medical and Clinical Sciences
Trans Stellar International Journal of Medicine and Pharmaceutical Sciences
David Publishing Journal of Life Sciences
International Journal of Life Sciences Biotechnology and Pharma Research International Journal of Life Sciences Biotechnology and Pharma Research
The Sims Institute Press Ltd Journal of Experimental & Clinical Assisted Reproduction
Lifescience Global Journal of Cancer Research Updates
International Scholars Journals International Journal of Oncology and Cancer Research
International Research Journals Journal of Medicine and Medical Sciences
BioMedSciDirect Publications International Journal of Biological & Medical Research
Pelagia Research Library Der Pharmacia Sinica
Discovery Publishing Group Medical Science Journal
Wudpecker Research Journals Wudpecker Journal of Medicinal Plants
Herald International Research Journals Herald Journal of Medicine and Medical Science
 Industry Research Collaboration Center International Journal on Bioinformatics & Biosciences
Cosmic Journals International journal of Education and Applied Research
Pharmacie Globale International Journal of Comprehensive Pharmacy
CSCanada Advances in Natural Science
Hikari Ltd. Clinical and Experimental Medical Sciences
International Journal of Pharma and Bio Sciences International Journal of Pharma and Bio Sciences
Journal of Chemical, Biological and Physical Sciences Journal of Chemical, Biological and Physical Sciences
International Digital Organization for Scientific Information Global Journal of Pharmacology
Open Access Science Research Publisher International Journal of Medicinal and Aromatic Plants
Association of Physiologist, Pharmacists and Pharmacologists (APPP) National Journal of Physiology, Pharmacy and Pharmacology
Science Instinct Publications International Journal of Pharmacology & Toxicology Science
Science Target International Journal of Herbs and Medicinal Plants
Society for Science and Nature International Journal of Advanced Biological Research
Society of Applied Sciences Asian Journal of Experimental Biological Sciences
Erudite Journals Limited Erudite Journal of Medicine and Medical Science Research
Scholar Science Journals International Journal of Biomedical Research
International Research Journals Agricultural Science Research Journal
Greener Journals Greener Journal of Medical Sciences
Global Research Online International Journal of Pharmaceutical Sciences Review and Research
PharmaInfo Journal of Pharmaceutical Sciences and Research
The Ribeir√£o Preto foundation for Scientific Research (FUNPEC) Genetics and Molecular Research
International Association of Journals & Conferences International Transaction Journal of Engineering, Management, & Applied Sciences & Technologies
Photon Foundation International Journal of Medicinal Plants
Praise Worthy Prize International Review of Biophysical Chemistry
SAVAP International Academic Research International
BioIT international Journals International Journal of Pharmaceutical Applications
Interscience Open Access Journals International Journal of Pharmacology and Pharmaceutical Technology
Advanced Research Journals International Journal of Advances in Pharmaceutical Sciences
SJournals Scientific Journal of Medical Science
International Institute for Science, Technology and Education Advances in Life Science and Technology
Silicon Valley Publishers International Journal of Applied Science and Technology
GVGS Research Journal of Pharmaceutical, Biological and Chemical Sciences
Online Research Journals Online Journal of Medicine and Medical Science Research
Wilolud Journals Continental Journal of Pharmaceutical Sciences
e3Journals Journal of Biotechnology and Pharmaceutical Research
International Journals of Engineering & Sciences International Journal of Basic & Applied Sciences
Global Research Journals Journal of Pharmacy and Pharmacological Research
Pharmaceutical and Biological Society Asian Journal of Pharmaceutical and Health Sciences
World Academy of Research and Publication International Journal of Chemical and Environmental Engineering
Textroad Journals Journal of Pharmaceutical and Biomedical Sciences
Research Publisher Journal of Biochemical and Pharmacological Research
Journal of Chemical and Pharmaceutical Research Journal of Chemical and Pharmaceutical Research
Power Control Optimization Global Journal of Technology and Optimization
Medipoeia Publication Journal of Applied Pharmaceutical Science
Master Publishing Group International Journal of Biomedical Science
Bioflux Advances in Agriculture & Botanics
American Journal of PharmTech Research American Journal of PharmTech Research
The Faculty of Medicine, University of Nis Acta Facultatis Medicae Naisensis
SERSC International Journal of Bio-Science and Bio-Technology
Xinnovem Publishing Group Biomirror
Sciensage Publications Journal of Advanced Scientific Research
PharmTech Publications Journal of Pharmaceutical Science and Technology
INSInet Publications Australian Journal of Basic and Applied Sciences
E Business Navigators Biology and Medicine
Global Academic Society Indian journal of Scientific Research
Medknow Publications Journal of Natural Pharmaceuticals
Elsevier Drug Invention Today
Bio Tech Systems Journal of Biotech Research
OmniScientia Journal of Nephrology and Renal Transplantation
Global Researchers Journals Journal of Physiology and Pharmacology Advances
Sooraj Publications International Journal of Chemical Sciences and Research
Green Earth Research Foundation GERF Bulletin of Bioscience
European Journal of Chemistry European Journal of Chemistry
Hygeia Hygeia
Trendz Publications Journal of Innovative trends in Pharmaceutical Sciences
Indo Global Journal of Pharmaceutical Sciences Indo Global Journal of Pharmaceutical Sciences
International Journal of Research in Pharmacy and Chemistry International Journal of Research in Pharmacy and Chemistry
Pharma Science Monitor Pharma Science Monitor
IJSAT Publishers International Journal of Science and Advanced Technology
Sadguru Publications Journal of Current Chemical & Pharmaceutical Sciences
 International Journal of Chemical and Pharmaceutical Sciences International Journal of Chemical and Pharmaceutical Sciences
IRPN International Journal of Science and Technology
University Ss Kiril and Metodij Macedonian Journal of Medical Sciences
International Journal of Research in Pharmacy and Science International Journal of Research in Pharmacy and Science
Journal of Medical Research Journal of Medical Research