Internal change
PiperOrigin-RevId: 278487365
jindalshivam09 committed Nov 5, 2019
1 parent ea72cdb commit 0c22c94
Showing 16 changed files with 2,610 additions and 173 deletions.
23 changes: 23 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,23 @@
# How to Contribute

We'd love to accept your patches and contributions to this project. There are
just a few small guidelines you need to follow.

## Contributor License Agreement

Contributions to this project must be accompanied by a Contributor License
Agreement. You (or your employer) retain the copyright to your contribution;
this simply gives us permission to use and redistribute your contributions as
part of the project. Head over to <https://cla.developers.google.com/> to see
your current agreements on file or to sign a new one.

You generally only need to submit a CLA once, so if you've already submitted one
(even if it was for a different project), you probably don't need to do it
again.

## Code reviews

All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.
54 changes: 54 additions & 0 deletions LICENSE
@@ -199,3 +199,57 @@
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

--------------------------------------------------------------------------------
MIT
The MIT License (MIT)

Copyright (c) 2014-2015, Jon Schlinkert.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.


--------------------------------------------------------------------------------
BSD-3-Clause
Copyright (c) 2016, Daniel Wirtz All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of its author, nor the names of its contributors
may be used to endorse or promote products derived from this software
without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
12 changes: 0 additions & 12 deletions examples/Fairness_Indicators_Example_Colab.ipynb
@@ -510,18 +510,6 @@
},
"name": "Fairness Indicators Example Colab.ipynb",
"provenance": [
{
"file_id": "/piper/depot/google3/learning/fairness/infra/demo/Fairness_Indicators_Example_Colab.ipynb",
"timestamp": 1572036963849
},
{
"file_id": "/piper/depot/google3/learning/fairness/infra/demo/Fairness_Indicators_Example_Colab.ipynb",
"timestamp": 1571687373989
},
{
"file_id": "/piper/depot/google3/learning/fairness/infra/demo/Fairness_Indicators_Example_Colab.ipynb",
"timestamp": 1571419949116
},
{
"file_id": "1QUN5dPxs1wxYmXUubDPSb5-R_ThhSkV9",
"timestamp": 1571334212446
161 changes: 0 additions & 161 deletions examples/tensorboard_e2e_demo_standalone_binary.py

This file was deleted.

100 changes: 100 additions & 0 deletions tensorboard_plugin/README.md
@@ -0,0 +1,100 @@
# Evaluating Models with the Fairness Indicators Dashboard [Beta]

![Fairness Indicators](https://raw.githubusercontent.com/tensorflow/tensorboard/master/docs/images/fairness-indicators.png)

Fairness Indicators for TensorBoard enables easy computation of
commonly-identified fairness metrics for _binary_ and _multiclass_ classifiers.
With the plugin, you can visualize fairness evaluations for your runs and easily
compare performance across groups.

In particular, Fairness Indicators for TensorBoard allows you to evaluate and
visualize model performance, sliced across defined groups of users. Feel
confident about your results with confidence intervals and evaluations at
multiple thresholds.
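As a concrete mental model of this kind of sliced, thresholded evaluation, here is a small self-contained sketch in plain Python. This is **not** the plugin's API; the function, data, and group names are all illustrative. It computes the false positive rate for each user group at a chosen decision threshold:

```python
from collections import defaultdict

def fpr_by_group(examples, threshold):
    """False positive rate per group: FP / (FP + TN), over negative-labeled examples."""
    fp = defaultdict(int)
    tn = defaultdict(int)
    for group, label, score in examples:
        if label == 0:  # only true negatives/false positives contribute to FPR
            if score >= threshold:
                fp[group] += 1
            else:
                tn[group] += 1
    return {g: fp[g] / (fp[g] + tn[g]) for g in set(fp) | set(tn)}

# (group, true_label, model_score) triples -- toy data for illustration
data = [
    ("a", 0, 0.9), ("a", 0, 0.2), ("a", 1, 0.8),
    ("b", 0, 0.7), ("b", 0, 0.6), ("b", 0, 0.1),
]
print(fpr_by_group(data, threshold=0.5))
```

Sweeping `threshold` over a range of values gives the multi-threshold view that the dashboard renders; the plugin additionally reports confidence intervals for each slice.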

Many existing tools for evaluating fairness concerns don’t work well on
large-scale datasets and models. At Google, it is important for us to have tools
that can work on billion-user systems. Fairness Indicators lets you evaluate use
cases of any size, in the TensorBoard environment or in
[Colab](https://github.com/tensorflow/fairness-indicators).

## Requirements

To install Fairness Indicators for TensorBoard, run:

```shell
python3 -m virtualenv ~/tensorboard_demo
source ~/tensorboard_demo/bin/activate
pip install --upgrade pip
pip install tensorboard_plugin_fairness_indicators
pip install "tensorflow_model_analysis>=0.15.1"
pip uninstall -y tensorboard tb-nightly
pip install --upgrade tb-nightly
```

## Demo

If you want to test out Fairness Indicators in TensorBoard, you can download
sample TensorFlow Model Analysis evaluation results (eval_config.json, metrics
and plots files) and a `demo.py` utility from Google Cloud Platform,
[here](https://console.cloud.google.com/storage/browser/tensorboard_plugin_fairness_indicators/).
(See [this](https://cloud.google.com/storage/docs/downloading-objects)
documentation for how to download files from Google Cloud Platform.) This evaluation
data is based on the
[Civil Comments dataset](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification),
calculated using TensorFlow Model Analysis's
[model_eval_lib](https://github.com/tensorflow/model-analysis/blob/master/tensorflow_model_analysis/api/model_eval_lib.py)
library. It also contains a sample TensorBoard summary data file for reference.
See the
[TensorBoard tutorial](https://github.com/tensorflow/tensorboard/blob/master/README.md)
for more information on summary data files.

The `demo.py` utility writes a TensorBoard summary data file, which will be read
by TensorBoard to render the Fairness Indicators dashboard. Flags to be used
with the `demo.py` utility:

- `--logdir`: Directory where TensorBoard will write the summary
- `--eval_result_output_dir`: Directory containing evaluation results
evaluated by TFMA (downloaded in last step)

Run the `demo.py` utility to write the summary results in the log directory:

```
python demo.py --logdir=<logdir>/demo \
  --eval_result_output_dir=<eval_result_dir>
```
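For illustration, the flag interface described above could be sketched with `argparse`. This is a hypothetical stand-in for the downloaded `demo.py`'s argument handling, not its actual source:

```python
import argparse

def parse_args(argv=None):
    # Mirrors the two flags the demo.py utility documents above.
    parser = argparse.ArgumentParser(
        description="Write Fairness Indicators summary data for TensorBoard.")
    parser.add_argument("--logdir", required=True,
                        help="Directory where TensorBoard will write the summary.")
    parser.add_argument("--eval_result_output_dir", required=True,
                        help="Directory containing evaluation results produced by TFMA.")
    return parser.parse_args(argv)

args = parse_args(["--logdir", "/tmp/demo", "--eval_result_output_dir", "/tmp/eval"])
print(args.logdir, args.eval_result_output_dir)
```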

Run TensorBoard:

Note: For this demo, please run TensorBoard from the same directory where you
have downloaded the evaluation results.

`tensorboard --logdir=<logdir>`

This will start a local instance. Once it is running, a link will be printed in
the terminal. Open the link in your browser to view the Fairness Indicators
dashboard.

## Usage

To use the Fairness Indicators with your own data and evaluations:

1. Train a new model and evaluate using
`tensorflow_model_analysis.run_model_analysis` or
`tensorflow_model_analysis.ExtractEvaluateAndWriteResults` API in
[model_eval_lib](https://github.com/tensorflow/model-analysis/blob/master/tensorflow_model_analysis/api/model_eval_lib.py).
For code snippets on how to do this, see the Fairness Indicators colab
[here](https://github.com/tensorflow/fairness-indicators).

2. Write a Fairness Indicators summary using the
   `tensorboard_plugin_fairness_indicators.summary_v2` API.

```
import tensorflow as tf
from tensorboard_plugin_fairness_indicators import summary_v2

writer = tf.summary.create_file_writer(<logdir>)
with writer.as_default():
    summary_v2.FairnessIndicators(<eval_result_dir>, step=1)
writer.close()
```
3. Run TensorBoard
- `tensorboard --logdir=<logdir>`
- Select the new evaluation run using the drop-down on the left side of
the dashboard to visualize results.