IBM Watson™ Ideas

Welcome to the IBM Watson™ Ideas Portal


We welcome and appreciate your feedback on IBM Watson™ Products to help make them even better than they are today!


If you are looking for troubleshooting help or wondering how to use our products and services, please check the IBM Watson™ documentation. Please do not use the Ideas Portal for reporting bugs - we ask that you report bugs or issues with the product by contacting IBM support.


Before you submit an idea, please perform a search first as a similar idea may have already been reported in the portal.


If a related idea is not yet listed, please create a new idea with a description that covers the expected behavior, why having this feature would improve the service, and how it would address your use case.

Develop a collection of REST APIs and SDKs that allow to automate IBM Watson Knowledge Studio functions

Develop functions that make it possible to automate IBM Watson Knowledge Studio activities from Python, such as:

1- Upload entities and relationships

2- Upload dictionaries and associate them with entities

3- Upload files to be annotated

4- Run annotators, especially the dictionary pre-annotator

5- Get sentences to be annotated

6- Set annotations: mentions, relationships, and coreferences

7- Evaluate against ground truth
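To make the request concrete, here is a minimal sketch of what a Python client for such an API could look like. Every endpoint path, payload shape, and method name below is an assumption invented for illustration; WKS does not expose these APIs today, which is exactly what this idea asks for. The sketch only builds request descriptions rather than performing HTTP calls.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WksRequest:
    """A REST call the hypothetical API would accept (not executed here)."""
    method: str
    url: str
    body: Optional[dict] = None

class WksClient:
    """Sketch of a client for the requested (non-existent) WKS REST API."""

    def __init__(self, base_url: str, project_id: str):
        self.base = base_url.rstrip("/")
        self.project = project_id

    def _req(self, method: str, path: str, body: Optional[dict] = None) -> WksRequest:
        return WksRequest(method, f"{self.base}/projects/{self.project}{path}", body)

    # 1- upload entities and relationships (the type system)
    def upload_type_system(self, entities, relations):
        return self._req("PUT", "/type-system",
                         {"entities": entities, "relations": relations})

    # 2- upload a dictionary and associate it with an entity type
    def upload_dictionary(self, entity_type, terms):
        return self._req("POST", "/dictionaries",
                         {"entity": entity_type, "terms": list(terms)})

    # 3- upload documents to be annotated
    def upload_documents(self, docs):
        return self._req("POST", "/documents", {"documents": docs})

    # 4- run an annotator, e.g. the dictionary pre-annotator
    def run_annotator(self, annotator="dictionary"):
        return self._req("POST", "/annotators/run", {"annotator": annotator})

    # 5- get sentences awaiting annotation
    def get_sentences(self):
        return self._req("GET", "/sentences")

    # 6- set mention / relationship / coreference annotations on a document
    def set_annotations(self, doc_id, mentions, relations=(), coreferences=()):
        return self._req("POST", f"/documents/{doc_id}/annotations",
                         {"mentions": list(mentions),
                          "relations": list(relations),
                          "coreferences": list(coreferences)})

    # 7- evaluate a trained model against a ground-truth set
    def evaluate(self, ground_truth_set):
        return self._req("POST", "/evaluate", {"set": ground_truth_set})
```

A caller would chain these in order (type system, dictionaries, documents, pre-annotation, annotation, evaluation), mirroring the seven steps above.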

  • Guest
  • Oct 10 2017
  • Marc Nehme commented
    October 12, 2017 14:58

    My specific request is to provide an API to automate the deployment of a MLM to WDS. This is for a key partner who is looking to scale & automate Discovery collections using a pre-defined WKS model.

  • Admin
    STEFAN TZANEV commented
    December 14, 2017 17:50

    Marc, we are planning to separate the publishing of trained models (done by WKS human annotators or project managers) from deploying the models to NLU/WDS (done by developers). NLU is working on a MMA (model management API). When this is ready, the API that you need would belong to NLU, not to WKS as WKS will be responsible for developing and publishing the models, not for managing the model deployment across services.

    If you feel that the other WKS API cases are needed, please elaborate on the business/client benefits of having such APIs. I don't see such benefits, apart from someone trying to replicate WKS in a different application.

  • Andrew Freed commented
    20 Mar 17:58

    My client needs to be able to upload annotations from their own application into WKS.  They collect new ground truth from a feedback function in their application.

  • Admin
    STEFAN TZANEV commented
    21 Mar 22:09

    borella@webeing.co

    Could you describe the scenario (workflow) that you envision for the API that you describe?

    Where would the artifacts (type system, docs) come from?

    How does step 6 fit into the workflow, and why do you need an API for it? (This is the human annotation activity.)

    What is the user benefit (business value) for your case? Are you saving time, scaling activities, replacing WKS GUI, or something else?

    The more information you provide, the better I'll understand your case and the better the chances of promoting this idea to the WKS roadmap. Thanks.

  • Admin
    STEFAN TZANEV commented
    21 Mar 22:15

    @Andrew Freed

    Making Watson smarter with use (harnessing end user feedback) is a very nice idea. We already have a placeholder in the roadmap, but haven't had time to discuss it seriously. As the NLU is currently working on a NLU Train API, I'm sure that soon we'll have technology to support the continuous learning idea.

    It seems that there are several different scenarios that require WKS API. I think that your idea leans more towards the NLU Train API, because I'm not sure that you need WKS for the use case that you describe.

  • Augusto Borella Hougaz commented
    25 Mar 14:05

    @STEFAN TZANEV

    The way the WKS GUI works today, the human annotator needs to focus on the annotation activity. This is quite repetitive and difficult to scale to different users => few people, many annotations.

    The idea is to create a different GUI with a different approach, allowing => many people, few annotations.

    My team and I have already presented the whole workflow to IBM DEG - Developer Ecosystem Group in Brazil.

  • STEFAN TZANEV commented
    30 Mar 13:39

    @Augusto,

    Thank you for the explanation. The idea (business case) is interesting - developing alternative WKS methodology.

    But it does not give me enough understanding of the concrete API requirements that you have.

    1. Could you provide more specifics - which specific tasks/features/capabilities in WKS do you want to be able to access, and how (GET, PUT, POST, PATCH, DELETE, ...)?

    2. Would your new annotation model require expansion of the WKS capabilities beyond just offering an API to the current capabilities?

    3. Based on the demonstrated new methodology, do you have any estimates on the typical load for your model? 

    4. Also, because in your case the hypothetical WKS API would be used for a WKS alternative, and because we would still be incurring operational costs, the API could not be offered for free. Are you willing to pay for using a WKS API, and if yes, what value do you put on it?

    Feel free to reach out to me directly if you are not comfortable sharing details in this channel.

    Thanks,

    Stefan

    Lead Offering Manager, Watson Knowledge Studio

  • Laksh Krishnamurthy commented
    24 May 16:10

    @Stefan,

    In working with a number of internal product teams within IBM that have used WKS or are in the process of using it, this requirement keeps coming up. Various teams have proposed ideas around their needs. As we look to merge our tooling into Watson Studio, there are capabilities within it that should work nicely, especially around the continuous learning techniques. In addition to the above, we need the following endpoints:


    a) Ability to evaluate a WKS model given a test set or blind set and produce metrics + a confusion matrix

    b) Ability to version and deploy a WKS model

    c) Ability to collect and store feedback

    d) Ability to provide continuous learning (enable continuous learning, provide thresholds for automatic re-training, accuracy thresholds for publishing a model - note: all of this exists in Studio today for WML)

  • JAUME MIRALLES SOLE commented
    09 Nov 10:06

    Hi. Let me share another use case from a customer who wishes to have WKS APIs:
    - The customer is developing a solution to process medical annotations from doctors.
    - For a new client (e.g. a hospital), the first thing they do is analyse a relevant set of documents with their own algorithms in Watson Studio, with the objective of obtaining a set of customized dictionaries for WKS.
    - Then they create a WKS project to train a custom model for that hospital. The training is designed to leverage pre-annotation with the dictionaries generated by their algorithms. So today they have to take the output files from Watson Studio and load them into WKS as dictionaries manually. Ideally, with an API, that process could be automated (create project + define type system + upload dictionaries and associate them with types).
    - Then they load sample documents into the WKS project, pre-annotate with the loaded dictionaries and, after a potential SME review, train and generate the final WKS model, which is later deployed to WDS or NLU. Eventually that could also be automated: load document set, launch pre-train, launch train, deploy.
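The workflow described above is a linear pipeline, so once the individual API calls exist, automating it reduces to running a fixed sequence of steps with a shared context. A minimal sketch, assuming the step names are purely illustrative (none of them are real WKS operations today):

```python
# Hypothetical end-to-end pipeline for the customer's workflow; each step
# would wrap one call to the requested WKS API.
PIPELINE = [
    "create_project",
    "define_type_system",
    "upload_dictionaries",        # dictionaries generated in Watson Studio
    "load_documents",
    "run_dictionary_preannotator",
    "sme_review",                 # optional human checkpoint
    "train_model",
    "deploy_model",               # target: WDS or NLU
]

def run_pipeline(steps, handlers, context):
    """Run each named step through its handler, threading a shared context."""
    for step in steps:
        context = handlers[step](context)
    return context
```

The SME-review step is the only one that is inherently interactive; its handler could simply pause the pipeline until a reviewer approves, keeping everything else unattended.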


    With a fully automated WKS, one could leverage the pre-training capabilities and generate ML models suited to certain document sets, with a set of entities trained solely from dictionaries, letting the SIRE system apply context understanding.
    This enables a business model in which a company that owns generic dictionaries for generic industry entities can generate customer models by applying its own dictionaries to customer datasets, without having to hand the dictionaries over to the customer.